2026-03-09T21:50:59.418 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-09T21:50:59.436 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T21:50:59.481 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/668
branch: squid
description: orch/cephadm/workunits/{0-distro/ubuntu_22.04 agent/off mon_election/classic task/test_cephadm}
email: null
first_in_suite: false
flavor: default
job_id: '668'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-09_11:23:05-orch-squid-none-default-vps
no_nested_subset: false
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      global:
        mon election default strategy: 1
      mgr:
        debug mgr: 20
        debug ms: 1
        mgr/cephadm/use_agent: false
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - client.0
seed: 3443
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm11.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCu52yw/o/RyIYJrtfPQNfS64DG5f7kBbbTE+pdj3iWbYXKiWomdypzDGnV3KNHJrQQ7M6dxaXtPHHCZ+5xRJcM=
tasks:
- install: null
- exec:
    mon.a:
    - yum install -y python3 || apt install -y python3
- workunit:
    clients:
      client.0:
      - cephadm/test_cephadm.sh
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-09_11:23:05
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-09T21:50:59.482 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-09T21:50:59.482 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-09T21:50:59.482 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-09T21:50:59.482 INFO:teuthology.task.internal:Checking packages...
2026-03-09T21:50:59.482 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-09T21:50:59.483 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T21:50:59.483 INFO:teuthology.packaging:ref: None
2026-03-09T21:50:59.483 INFO:teuthology.packaging:tag: None
2026-03-09T21:50:59.483 INFO:teuthology.packaging:branch: squid
2026-03-09T21:50:59.483 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T21:50:59.483 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=squid
2026-03-09T21:51:00.154 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678-ge911bdeb-1jammy
2026-03-09T21:51:00.156 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-09T21:51:00.211 INFO:teuthology.task.internal:no buildpackages task found
2026-03-09T21:51:00.212 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-09T21:51:00.255 INFO:teuthology.task.internal:Saving configuration
2026-03-09T21:51:00.271 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-09T21:51:00.302 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-09T21:51:00.310 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm11.local', 'description': '/archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/668', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-09 21:50:15.978027', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:0b', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCu52yw/o/RyIYJrtfPQNfS64DG5f7kBbbTE+pdj3iWbYXKiWomdypzDGnV3KNHJrQQ7M6dxaXtPHHCZ+5xRJcM='}
2026-03-09T21:51:00.310 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-09T21:51:00.322 INFO:teuthology.task.internal:roles: ubuntu@vm11.local - ['mon.a', 'mgr.x', 'osd.0', 'client.0']
2026-03-09T21:51:00.322 INFO:teuthology.run_tasks:Running task console_log...
2026-03-09T21:51:00.349 DEBUG:teuthology.task.console_log:vm11 does not support IPMI; excluding
2026-03-09T21:51:00.350 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7fc921422290>, signals=[15])
2026-03-09T21:51:00.350 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-09T21:51:00.377 INFO:teuthology.task.internal:Opening connections...
2026-03-09T21:51:00.389 DEBUG:teuthology.task.internal:connecting to ubuntu@vm11.local
2026-03-09T21:51:00.390 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm11.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T21:51:00.453 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-09T21:51:00.461 DEBUG:teuthology.orchestra.run.vm11:> uname -m
2026-03-09T21:51:00.589 INFO:teuthology.orchestra.run.vm11.stdout:x86_64
2026-03-09T21:51:00.589 DEBUG:teuthology.orchestra.run.vm11:> cat /etc/os-release
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:NAME="Ubuntu"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:VERSION_ID="22.04"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:VERSION_CODENAME=jammy
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:ID=ubuntu
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:ID_LIKE=debian
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-09T21:51:00.635 INFO:teuthology.orchestra.run.vm11.stdout:UBUNTU_CODENAME=jammy
2026-03-09T21:51:00.636 INFO:teuthology.lock.ops:Updating vm11.local on lock server
2026-03-09T21:51:00.652 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-09T21:51:00.655 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-09T21:51:00.692 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-09T21:51:00.692 DEBUG:teuthology.orchestra.run.vm11:> test '!' -e /home/ubuntu/cephtest
2026-03-09T21:51:00.695 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-09T21:51:00.699 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-09T21:51:00.699 DEBUG:teuthology.orchestra.run.vm11:> test -z $(ls -A /var/lib/ceph)
2026-03-09T21:51:00.743 INFO:teuthology.orchestra.run.vm11.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-09T21:51:00.743 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-09T21:51:00.754 DEBUG:teuthology.orchestra.run.vm11:> test -e /ceph-qa-ready
2026-03-09T21:51:00.788 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T21:51:01.195 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-09T21:51:01.196 INFO:teuthology.task.internal:Creating test directory...
2026-03-09T21:51:01.196 DEBUG:teuthology.orchestra.run.vm11:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-09T21:51:01.199 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-09T21:51:01.214 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-09T21:51:01.219 INFO:teuthology.task.internal:Creating archive directory...
2026-03-09T21:51:01.219 DEBUG:teuthology.orchestra.run.vm11:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-09T21:51:01.246 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-09T21:51:01.248 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-09T21:51:01.248 DEBUG:teuthology.orchestra.run.vm11:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-09T21:51:01.289 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T21:51:01.289 DEBUG:teuthology.orchestra.run.vm11:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-09T21:51:01.349 INFO:teuthology.orchestra.run.vm11.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T21:51:01.356 INFO:teuthology.orchestra.run.vm11.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T21:51:01.357 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-09T21:51:01.358 INFO:teuthology.task.internal:Configuring sudo...
2026-03-09T21:51:01.358 DEBUG:teuthology.orchestra.run.vm11:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-09T21:51:01.412 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-09T21:51:01.435 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-09T21:51:01.435 DEBUG:teuthology.orchestra.run.vm11:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-09T21:51:01.457 DEBUG:teuthology.orchestra.run.vm11:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T21:51:01.501 DEBUG:teuthology.orchestra.run.vm11:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T21:51:01.545 DEBUG:teuthology.orchestra.run.vm11:> set -ex
2026-03-09T21:51:01.546 DEBUG:teuthology.orchestra.run.vm11:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-09T21:51:01.597 DEBUG:teuthology.orchestra.run.vm11:> sudo service rsyslog restart
2026-03-09T21:51:01.662 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-09T21:51:01.686 INFO:teuthology.task.internal:Starting timer...
2026-03-09T21:51:01.686 INFO:teuthology.run_tasks:Running task pcp...
2026-03-09T21:51:01.699 INFO:teuthology.run_tasks:Running task selinux...
2026-03-09T21:51:01.707 INFO:teuthology.task.selinux:Excluding vm11: VMs are not yet supported
2026-03-09T21:51:01.707 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-09T21:51:01.707 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-09T21:51:01.707 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-09T21:51:01.707 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-09T21:51:01.727 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-09T21:51:01.735 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-09T21:51:01.742 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-09T21:51:01.746 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventoryftlf_tin --limit vm11.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-09T21:53:34.443 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm11.local')]
2026-03-09T21:53:34.443 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm11.local'
2026-03-09T21:53:34.443 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm11.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T21:53:34.505 DEBUG:teuthology.orchestra.run.vm11:> true
2026-03-09T21:53:34.741 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm11.local'
2026-03-09T21:53:34.741 INFO:teuthology.run_tasks:Running task clock...
2026-03-09T21:53:34.743 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-09T21:53:34.743 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-09T21:53:34.743 DEBUG:teuthology.orchestra.run.vm11:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T21:53:34.799 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Command line: ntpd -gq
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: ----------------------------------------------------
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: ntp-4 is maintained by Network Time Foundation,
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: corporation. Support and training for ntp-4 are
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: available at https://www.nwtime.org/support
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: ----------------------------------------------------
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: proto: precision = 0.029 usec (-25)
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: basedate set to 2022-02-04
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: gps base set to 2022-02-06 (week 2196)
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-09T21:53:34.800 INFO:teuthology.orchestra.run.vm11.stderr: 9 Mar 21:53:34 ntpd[16098]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 72 days ago
2026-03-09T21:53:34.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Listen and drop on 0 v6wildcard [::]:123
2026-03-09T21:53:34.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-09T21:53:34.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Listen normally on 2 lo 127.0.0.1:123
2026-03-09T21:53:34.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Listen normally on 3 ens3 192.168.123.111:123
2026-03-09T21:53:34.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Listen normally on 4 lo [::1]:123
2026-03-09T21:53:34.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:b%2]:123
2026-03-09T21:53:34.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:34 ntpd[16098]: Listening on routing socket on fd #22 for interface updates
2026-03-09T21:53:35.801 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:35 ntpd[16098]: Soliciting pool server 93.177.65.20
2026-03-09T21:53:36.799 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:36 ntpd[16098]: Soliciting pool server 134.60.1.30
2026-03-09T21:53:36.800 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:36 ntpd[16098]: Soliciting pool server 172.104.149.161
2026-03-09T21:53:37.799 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:37 ntpd[16098]: Soliciting pool server 18.192.244.117
2026-03-09T21:53:37.799 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:37 ntpd[16098]: Soliciting pool server 157.90.15.187
2026-03-09T21:53:37.952 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:37 ntpd[16098]: Soliciting pool server 217.154.182.60
2026-03-09T21:53:38.798 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:38 ntpd[16098]: Soliciting pool server 139.144.71.56
2026-03-09T21:53:38.798 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:38 ntpd[16098]: Soliciting pool server 51.75.67.47
2026-03-09T21:53:38.798 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:38 ntpd[16098]: Soliciting pool server 85.220.190.246
2026-03-09T21:53:38.798 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:38 ntpd[16098]: Soliciting pool server 217.160.19.219
2026-03-09T21:53:39.797 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:39 ntpd[16098]: Soliciting pool server 194.59.205.229
2026-03-09T21:53:39.797 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:39 ntpd[16098]: Soliciting pool server 185.252.140.125
2026-03-09T21:53:39.798 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:39 ntpd[16098]: Soliciting pool server 79.133.44.139
2026-03-09T21:53:39.799 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:39 ntpd[16098]: Soliciting pool server 185.125.190.56
2026-03-09T21:53:40.797 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:40 ntpd[16098]: Soliciting pool server 185.125.190.58
2026-03-09T21:53:40.797 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:40 ntpd[16098]: Soliciting pool server 152.53.191.142
2026-03-09T21:53:40.797 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:40 ntpd[16098]: Soliciting pool server 77.42.16.222
2026-03-09T21:53:42.824 INFO:teuthology.orchestra.run.vm11.stdout: 9 Mar 21:53:42 ntpd[16098]: ntpd: time slew +0.000189 s
2026-03-09T21:53:42.824 INFO:teuthology.orchestra.run.vm11.stdout:ntpd: time slew +0.000189s
2026-03-09T21:53:42.842 INFO:teuthology.orchestra.run.vm11.stdout: remote refid st t when poll reach delay offset jitter
2026-03-09T21:53:42.842 INFO:teuthology.orchestra.run.vm11.stdout:==============================================================================
2026-03-09T21:53:42.842 INFO:teuthology.orchestra.run.vm11.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T21:53:42.842 INFO:teuthology.orchestra.run.vm11.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T21:53:42.842 INFO:teuthology.orchestra.run.vm11.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T21:53:42.842 INFO:teuthology.orchestra.run.vm11.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T21:53:42.842 INFO:teuthology.orchestra.run.vm11.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T21:53:42.842 INFO:teuthology.run_tasks:Running task install...
2026-03-09T21:53:42.844 DEBUG:teuthology.task.install:project ceph
2026-03-09T21:53:42.845 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-09T21:53:42.845 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-09T21:53:42.845 INFO:teuthology.task.install:Using flavor: default
2026-03-09T21:53:42.847 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-09T21:53:42.847 INFO:teuthology.task.install:extra packages: []
2026-03-09T21:53:42.847 DEBUG:teuthology.orchestra.run.vm11:> sudo apt-key list | grep Ceph
2026-03-09T21:53:42.918 INFO:teuthology.orchestra.run.vm11.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-09T21:53:42.935 INFO:teuthology.orchestra.run.vm11.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-09T21:53:42.935 INFO:teuthology.orchestra.run.vm11.stdout:uid [ unknown] Ceph.com (release key)
2026-03-09T21:53:42.935 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-09T21:53:42.935 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-xmltodict, python3-jmespath on remote deb x86_64
2026-03-09T21:53:42.935 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T21:53:43.601 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default/
2026-03-09T21:53:43.601 INFO:teuthology.task.install.deb:Package version is 19.2.3-678-ge911bdeb-1jammy
2026-03-09T21:53:44.133 DEBUG:teuthology.orchestra.run.vm11:> set -ex
2026-03-09T21:53:44.133 DEBUG:teuthology.orchestra.run.vm11:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-09T21:53:44.141 DEBUG:teuthology.orchestra.run.vm11:> sudo apt-get update
2026-03-09T21:53:44.324 INFO:teuthology.orchestra.run.vm11.stdout:Hit:1 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-09T21:53:44.418 INFO:teuthology.orchestra.run.vm11.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-09T21:53:44.450 INFO:teuthology.orchestra.run.vm11.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-09T21:53:44.482 INFO:teuthology.orchestra.run.vm11.stdout:Hit:4 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-09T21:53:44.844 INFO:teuthology.orchestra.run.vm11.stdout:Ign:5 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy InRelease
2026-03-09T21:53:44.959 INFO:teuthology.orchestra.run.vm11.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release [7662 B]
2026-03-09T21:53:45.074 INFO:teuthology.orchestra.run.vm11.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-09T21:53:45.188 INFO:teuthology.orchestra.run.vm11.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.1 kB]
2026-03-09T21:53:45.261 INFO:teuthology.orchestra.run.vm11.stdout:Fetched 25.8 kB in 1s (26.5 kB/s)
2026-03-09T21:53:45.946 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T21:53:45.958 DEBUG:teuthology.orchestra.run.vm11:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=19.2.3-678-ge911bdeb-1jammy cephadm=19.2.3-678-ge911bdeb-1jammy ceph-mds=19.2.3-678-ge911bdeb-1jammy ceph-mgr=19.2.3-678-ge911bdeb-1jammy ceph-common=19.2.3-678-ge911bdeb-1jammy ceph-fuse=19.2.3-678-ge911bdeb-1jammy ceph-test=19.2.3-678-ge911bdeb-1jammy ceph-volume=19.2.3-678-ge911bdeb-1jammy radosgw=19.2.3-678-ge911bdeb-1jammy python3-rados=19.2.3-678-ge911bdeb-1jammy python3-rgw=19.2.3-678-ge911bdeb-1jammy python3-cephfs=19.2.3-678-ge911bdeb-1jammy python3-rbd=19.2.3-678-ge911bdeb-1jammy libcephfs2=19.2.3-678-ge911bdeb-1jammy libcephfs-dev=19.2.3-678-ge911bdeb-1jammy librados2=19.2.3-678-ge911bdeb-1jammy librbd1=19.2.3-678-ge911bdeb-1jammy rbd-fuse=19.2.3-678-ge911bdeb-1jammy
2026-03-09T21:53:45.993 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T21:53:46.198 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 2026-03-09T21:53:46.198 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 2026-03-09T21:53:46.430 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T21:53:46.430 INFO:teuthology.orchestra.run.vm11.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-09T21:53:46.430 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-09T21:53:46.430 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T21:53:46.431 INFO:teuthology.orchestra.run.vm11.stdout:The following additional packages will be installed: 2026-03-09T21:53:46.431 INFO:teuthology.orchestra.run.vm11.stdout: ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local 2026-03-09T21:53:46.431 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq 2026-03-09T21:53:46.431 INFO:teuthology.orchestra.run.vm11.stdout: libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 liblua5.3-dev libnbd0 2026-03-09T21:53:46.431 INFO:teuthology.orchestra.run.vm11.stdout: liboath0 libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5 2026-03-09T21:53:46.431 INFO:teuthology.orchestra.run.vm11.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsqlite3-mod-ceph 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: libthrift-0.16.0 lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 
python3-google-auth python3-iniconfig 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-pastescript python3-pecan python3-pluggy python3-portend 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-prettytable python3-psutil python3-py python3-pygments 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-pytest python3-repoze.lru 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-toml python3-waitress python3-wcwidth python3-webob 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: python3-websocket python3-webtest python3-werkzeug python3-zc.lockfile 2026-03-09T21:53:46.432 INFO:teuthology.orchestra.run.vm11.stdout: qttranslations5-l10n smartmontools socat unzip xmlstarlet zip 2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout:Suggested packages: 2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: python3-influxdb readline-doc python3-beaker python-mako-doc 2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: python-natsort-doc httpd-wsgi libapache2-mod-python libapache2-mod-scgi 
2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: libjs-mochikit python-pecan-doc python-psutil-doc subversion
2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: python-pygments-doc ttf-bitstream-vera python-pyinotify-doc python3-dap
2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: python-sklearn-doc ipython3 python-waitress-doc python-webob-doc
2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: python-webtest-doc python-werkzeug-doc python3-watchdog gsmartcontrol
2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: smart-notifier mailx | mailutils
2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout:Recommended packages:
2026-03-09T21:53:46.433 INFO:teuthology.orchestra.run.vm11.stdout: btrfs-tools
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout:The following NEW packages will be installed:
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout: ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout: libcephfs-dev libcephfs2 libdouble-conversion3 libfuse2 libjq1 liblttng-ust1
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout: liblua5.3-dev libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout: libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 libreadline-dev
2026-03-09T21:53:46.472 INFO:teuthology.orchestra.run.vm11.stdout: librgw2 libsqlite3-mod-ceph libthrift-0.16.0 lua-any lua-sec lua-socket
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: lua5.1 luarocks nvme-cli pkg-config python-asyncssh-doc
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-prettytable python3-psutil python3-py python3-pygments
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-pytest python3-rados python3-rbd
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-repoze.lru python3-requests-oauthlib python3-rgw python3-routes
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-rsa python3-simplegeneric python3-simplejson python3-singledispatch
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn python3-sklearn-lib python3-tempita python3-tempora
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-threadpoolctl python3-toml python3-waitress python3-wcwidth
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile qttranslations5-l10n radosgw rbd-fuse smartmontools
2026-03-09T21:53:46.473 INFO:teuthology.orchestra.run.vm11.stdout: socat unzip xmlstarlet zip
2026-03-09T21:53:46.474 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be upgraded:
2026-03-09T21:53:46.474 INFO:teuthology.orchestra.run.vm11.stdout: librados2 librbd1
2026-03-09T21:53:46.941 INFO:teuthology.orchestra.run.vm11.stdout:2 upgraded, 107 newly installed, 0 to remove and 10 not upgraded.
2026-03-09T21:53:46.942 INFO:teuthology.orchestra.run.vm11.stdout:Need to get 178 MB of archives.
2026-03-09T21:53:46.942 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 782 MB of additional disk space will be used.
2026-03-09T21:53:46.942 INFO:teuthology.orchestra.run.vm11.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB]
2026-03-09T21:53:47.096 INFO:teuthology.orchestra.run.vm11.stdout:Get:2 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 19.2.3-678-ge911bdeb-1jammy [3257 kB]
2026-03-09T21:53:47.421 INFO:teuthology.orchestra.run.vm11.stdout:Get:3 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB]
2026-03-09T21:53:47.436 INFO:teuthology.orchestra.run.vm11.stdout:Get:4 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB]
2026-03-09T21:53:47.533 INFO:teuthology.orchestra.run.vm11.stdout:Get:5 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB]
2026-03-09T21:53:47.824 INFO:teuthology.orchestra.run.vm11.stdout:Get:6 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB]
2026-03-09T21:53:47.833 INFO:teuthology.orchestra.run.vm11.stdout:Get:7 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB]
2026-03-09T21:53:47.877 INFO:teuthology.orchestra.run.vm11.stdout:Get:8 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB]
2026-03-09T21:53:47.889 INFO:teuthology.orchestra.run.vm11.stdout:Get:9 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB]
2026-03-09T21:53:47.892 INFO:teuthology.orchestra.run.vm11.stdout:Get:10 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB]
2026-03-09T21:53:47.892 INFO:teuthology.orchestra.run.vm11.stdout:Get:11 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB]
2026-03-09T21:53:47.893 INFO:teuthology.orchestra.run.vm11.stdout:Get:12 https://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB]
2026-03-09T21:53:47.917 INFO:teuthology.orchestra.run.vm11.stdout:Get:13 https://archive.ubuntu.com/ubuntu jammy/main amd64 libreadline-dev amd64 8.1.2-1 [166 kB]
2026-03-09T21:53:47.923 INFO:teuthology.orchestra.run.vm11.stdout:Get:14 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblua5.3-dev amd64 5.3.6-1build1 [167 kB]
2026-03-09T21:53:47.928 INFO:teuthology.orchestra.run.vm11.stdout:Get:15 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua5.1 amd64 5.1.5-8.1build4 [94.6 kB]
2026-03-09T21:53:47.949 INFO:teuthology.orchestra.run.vm11.stdout:Get:16 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 19.2.3-678-ge911bdeb-1jammy [3597 kB]
2026-03-09T21:53:48.025 INFO:teuthology.orchestra.run.vm11.stdout:Get:17 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-any all 27ubuntu1 [5034 B]
2026-03-09T21:53:48.026 INFO:teuthology.orchestra.run.vm11.stdout:Get:18 https://archive.ubuntu.com/ubuntu jammy/main amd64 zip amd64 3.0-12build2 [176 kB]
2026-03-09T21:53:48.028 INFO:teuthology.orchestra.run.vm11.stdout:Get:19 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 unzip amd64 6.0-26ubuntu3.2 [175 kB]
2026-03-09T21:53:48.031 INFO:teuthology.orchestra.run.vm11.stdout:Get:20 https://archive.ubuntu.com/ubuntu jammy/universe amd64 luarocks all 3.8.0+dfsg1-1 [140 kB]
2026-03-09T21:53:48.033 INFO:teuthology.orchestra.run.vm11.stdout:Get:21 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB]
2026-03-09T21:53:48.034 INFO:teuthology.orchestra.run.vm11.stdout:Get:22 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B]
2026-03-09T21:53:48.034 INFO:teuthology.orchestra.run.vm11.stdout:Get:23 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB]
2026-03-09T21:53:48.035 INFO:teuthology.orchestra.run.vm11.stdout:Get:24 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B]
2026-03-09T21:53:48.077 INFO:teuthology.orchestra.run.vm11.stdout:Get:25 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 19.2.3-678-ge911bdeb-1jammy [979 kB]
2026-03-09T21:53:48.089 INFO:teuthology.orchestra.run.vm11.stdout:Get:26 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 19.2.3-678-ge911bdeb-1jammy [357 kB]
2026-03-09T21:53:48.094 INFO:teuthology.orchestra.run.vm11.stdout:Get:27 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 19.2.3-678-ge911bdeb-1jammy [32.9 kB]
2026-03-09T21:53:48.095 INFO:teuthology.orchestra.run.vm11.stdout:Get:28 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 19.2.3-678-ge911bdeb-1jammy [184 kB]
2026-03-09T21:53:48.098 INFO:teuthology.orchestra.run.vm11.stdout:Get:29 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 19.2.3-678-ge911bdeb-1jammy [70.1 kB]
2026-03-09T21:53:48.099 INFO:teuthology.orchestra.run.vm11.stdout:Get:30 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 19.2.3-678-ge911bdeb-1jammy [334 kB]
2026-03-09T21:53:48.105 INFO:teuthology.orchestra.run.vm11.stdout:Get:31 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 19.2.3-678-ge911bdeb-1jammy [6935 kB]
2026-03-09T21:53:48.135 INFO:teuthology.orchestra.run.vm11.stdout:Get:32 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B]
2026-03-09T21:53:48.146 INFO:teuthology.orchestra.run.vm11.stdout:Get:33 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB]
2026-03-09T21:53:48.146 INFO:teuthology.orchestra.run.vm11.stdout:Get:34 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB]
2026-03-09T21:53:48.146 INFO:teuthology.orchestra.run.vm11.stdout:Get:35 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B]
2026-03-09T21:53:48.146 INFO:teuthology.orchestra.run.vm11.stdout:Get:36 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B]
2026-03-09T21:53:48.238 INFO:teuthology.orchestra.run.vm11.stdout:Get:37 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB]
2026-03-09T21:53:48.241 INFO:teuthology.orchestra.run.vm11.stdout:Get:38 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB]
2026-03-09T21:53:48.241 INFO:teuthology.orchestra.run.vm11.stdout:Get:39 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-logutils all 0.3.3-8 [17.6 kB]
2026-03-09T21:53:48.241 INFO:teuthology.orchestra.run.vm11.stdout:Get:40 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-mako all 1.1.3+ds1-2ubuntu0.1 [60.5 kB]
2026-03-09T21:53:48.242 INFO:teuthology.orchestra.run.vm11.stdout:Get:41 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplegeneric all 0.8.1-3 [11.3 kB]
2026-03-09T21:53:48.340 INFO:teuthology.orchestra.run.vm11.stdout:Get:42 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-singledispatch all 3.4.0.3-3 [7320 B]
2026-03-09T21:53:48.340 INFO:teuthology.orchestra.run.vm11.stdout:Get:43 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB]
2026-03-09T21:53:48.341 INFO:teuthology.orchestra.run.vm11.stdout:Get:44 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-waitress all 1.4.4-1.1ubuntu1.1 [47.0 kB]
2026-03-09T21:53:48.341 INFO:teuthology.orchestra.run.vm11.stdout:Get:45 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempita all 0.5.2-6ubuntu1 [15.1 kB]
2026-03-09T21:53:48.342 INFO:teuthology.orchestra.run.vm11.stdout:Get:46 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-paste all 3.5.0+dfsg1-1 [456 kB]
2026-03-09T21:53:48.437 INFO:teuthology.orchestra.run.vm11.stdout:Get:47 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 19.2.3-678-ge911bdeb-1jammy [112 kB]
2026-03-09T21:53:48.440 INFO:teuthology.orchestra.run.vm11.stdout:Get:48 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 19.2.3-678-ge911bdeb-1jammy [470 kB]
2026-03-09T21:53:48.442 INFO:teuthology.orchestra.run.vm11.stdout:Get:49 https://archive.ubuntu.com/ubuntu jammy/main amd64 python-pastedeploy-tpl all 2.1.1-1 [4892 B]
2026-03-09T21:53:48.442 INFO:teuthology.orchestra.run.vm11.stdout:Get:50 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastedeploy all 2.1.1-1 [26.6 kB]
2026-03-09T21:53:48.443 INFO:teuthology.orchestra.run.vm11.stdout:Get:51 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-webtest all 2.0.35-1 [28.5 kB]
2026-03-09T21:53:48.443 INFO:teuthology.orchestra.run.vm11.stdout:Get:52 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pecan all 1.3.3-4ubuntu2 [87.3 kB]
2026-03-09T21:53:48.447 INFO:teuthology.orchestra.run.vm11.stdout:Get:53 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 19.2.3-678-ge911bdeb-1jammy [26.5 MB]
2026-03-09T21:53:48.447 INFO:teuthology.orchestra.run.vm11.stdout:Get:54 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-werkzeug all 2.0.2+dfsg1-1ubuntu0.22.04.3 [181 kB]
2026-03-09T21:53:48.544 INFO:teuthology.orchestra.run.vm11.stdout:Get:55 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB]
2026-03-09T21:53:48.546 INFO:teuthology.orchestra.run.vm11.stdout:Get:56 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB]
2026-03-09T21:53:48.548 INFO:teuthology.orchestra.run.vm11.stdout:Get:57 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB]
2026-03-09T21:53:48.549 INFO:teuthology.orchestra.run.vm11.stdout:Get:58 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB]
2026-03-09T21:53:48.549 INFO:teuthology.orchestra.run.vm11.stdout:Get:59 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB]
2026-03-09T21:53:48.719 INFO:teuthology.orchestra.run.vm11.stdout:Get:60 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB]
2026-03-09T21:53:48.721 INFO:teuthology.orchestra.run.vm11.stdout:Get:61 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB]
2026-03-09T21:53:48.721 INFO:teuthology.orchestra.run.vm11.stdout:Get:62 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB]
2026-03-09T21:53:48.729 INFO:teuthology.orchestra.run.vm11.stdout:Get:63 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B]
2026-03-09T21:53:48.729 INFO:teuthology.orchestra.run.vm11.stdout:Get:64 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB]
2026-03-09T21:53:48.750 INFO:teuthology.orchestra.run.vm11.stdout:Get:65 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB]
2026-03-09T21:53:48.750 INFO:teuthology.orchestra.run.vm11.stdout:Get:66 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB]
2026-03-09T21:53:48.750 INFO:teuthology.orchestra.run.vm11.stdout:Get:67 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB]
2026-03-09T21:53:48.751 INFO:teuthology.orchestra.run.vm11.stdout:Get:68 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB]
2026-03-09T21:53:48.851 INFO:teuthology.orchestra.run.vm11.stdout:Get:69 https://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB]
2026-03-09T21:53:48.853 INFO:teuthology.orchestra.run.vm11.stdout:Get:70 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB]
2026-03-09T21:53:48.855 INFO:teuthology.orchestra.run.vm11.stdout:Get:71 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB]
2026-03-09T21:53:48.855 INFO:teuthology.orchestra.run.vm11.stdout:Get:72 https://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB]
2026-03-09T21:53:48.860 INFO:teuthology.orchestra.run.vm11.stdout:Get:73 https://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB]
2026-03-09T21:53:48.864 INFO:teuthology.orchestra.run.vm11.stdout:Get:74 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-socket amd64 3.0~rc1+git+ac3201d-6 [78.9 kB]
2026-03-09T21:53:48.953 INFO:teuthology.orchestra.run.vm11.stdout:Get:75 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-sec amd64 1.0.2-1 [37.6 kB]
2026-03-09T21:53:48.954 INFO:teuthology.orchestra.run.vm11.stdout:Get:76 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB]
2026-03-09T21:53:48.960 INFO:teuthology.orchestra.run.vm11.stdout:Get:77 https://archive.ubuntu.com/ubuntu jammy/main amd64 pkg-config amd64 0.29.2-1ubuntu3 [48.2 kB]
2026-03-09T21:53:48.960 INFO:teuthology.orchestra.run.vm11.stdout:Get:78 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB]
2026-03-09T21:53:49.056 INFO:teuthology.orchestra.run.vm11.stdout:Get:79 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B]
2026-03-09T21:53:49.056 INFO:teuthology.orchestra.run.vm11.stdout:Get:80 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastescript all 2.0.2-4 [54.6 kB]
2026-03-09T21:53:49.057 INFO:teuthology.orchestra.run.vm11.stdout:Get:81 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB]
2026-03-09T21:53:49.057 INFO:teuthology.orchestra.run.vm11.stdout:Get:82 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB]
2026-03-09T21:53:49.059 INFO:teuthology.orchestra.run.vm11.stdout:Get:83 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB]
2026-03-09T21:53:49.060 INFO:teuthology.orchestra.run.vm11.stdout:Get:84 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB]
2026-03-09T21:53:49.160 INFO:teuthology.orchestra.run.vm11.stdout:Get:85 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pyinotify all 0.9.6-1.3 [24.8 kB]
2026-03-09T21:53:49.160 INFO:teuthology.orchestra.run.vm11.stdout:Get:86 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB]
2026-03-09T21:53:49.161 INFO:teuthology.orchestra.run.vm11.stdout:Get:87 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB]
2026-03-09T21:53:49.164 INFO:teuthology.orchestra.run.vm11.stdout:Get:88 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB]
2026-03-09T21:53:49.262 INFO:teuthology.orchestra.run.vm11.stdout:Get:89 https://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB]
2026-03-09T21:53:49.432 INFO:teuthology.orchestra.run.vm11.stdout:Get:90 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB]
2026-03-09T21:53:49.822 INFO:teuthology.orchestra.run.vm11.stdout:Get:91 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 19.2.3-678-ge911bdeb-1jammy [5178 kB]
2026-03-09T21:53:49.973 INFO:teuthology.orchestra.run.vm11.stdout:Get:92 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 19.2.3-678-ge911bdeb-1jammy [248 kB]
2026-03-09T21:53:49.986 INFO:teuthology.orchestra.run.vm11.stdout:Get:93 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 19.2.3-678-ge911bdeb-1jammy [125 kB]
2026-03-09T21:53:50.054 INFO:teuthology.orchestra.run.vm11.stdout:Get:94 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 19.2.3-678-ge911bdeb-1jammy [1081 kB]
2026-03-09T21:53:50.071 INFO:teuthology.orchestra.run.vm11.stdout:Get:95 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 19.2.3-678-ge911bdeb-1jammy [6239 kB]
2026-03-09T21:53:50.319 INFO:teuthology.orchestra.run.vm11.stdout:Get:96 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 19.2.3-678-ge911bdeb-1jammy [23.0 MB]
2026-03-09T21:53:51.198 INFO:teuthology.orchestra.run.vm11.stdout:Get:97 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 19.2.3-678-ge911bdeb-1jammy [14.2 kB]
2026-03-09T21:53:51.198 INFO:teuthology.orchestra.run.vm11.stdout:Get:98 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 19.2.3-678-ge911bdeb-1jammy [1173 kB]
2026-03-09T21:53:51.287 INFO:teuthology.orchestra.run.vm11.stdout:Get:99 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 19.2.3-678-ge911bdeb-1jammy [2503 kB]
2026-03-09T21:53:51.389 INFO:teuthology.orchestra.run.vm11.stdout:Get:100 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 19.2.3-678-ge911bdeb-1jammy [798 kB]
2026-03-09T21:53:51.412 INFO:teuthology.orchestra.run.vm11.stdout:Get:101 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 19.2.3-678-ge911bdeb-1jammy [157 kB]
2026-03-09T21:53:51.413 INFO:teuthology.orchestra.run.vm11.stdout:Get:102 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 19.2.3-678-ge911bdeb-1jammy [2396 kB]
2026-03-09T21:53:51.511 INFO:teuthology.orchestra.run.vm11.stdout:Get:103 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 19.2.3-678-ge911bdeb-1jammy [8625 kB]
2026-03-09T21:53:51.866 INFO:teuthology.orchestra.run.vm11.stdout:Get:104 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 19.2.3-678-ge911bdeb-1jammy [14.3 kB]
2026-03-09T21:53:51.866 INFO:teuthology.orchestra.run.vm11.stdout:Get:105 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 19.2.3-678-ge911bdeb-1jammy [52.1 MB]
2026-03-09T21:53:54.507 INFO:teuthology.orchestra.run.vm11.stdout:Get:106 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 19.2.3-678-ge911bdeb-1jammy [135 kB]
2026-03-09T21:53:54.507 INFO:teuthology.orchestra.run.vm11.stdout:Get:107 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 19.2.3-678-ge911bdeb-1jammy [41.0 kB]
2026-03-09T21:53:54.507 INFO:teuthology.orchestra.run.vm11.stdout:Get:108 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 19.2.3-678-ge911bdeb-1jammy [13.7 MB]
2026-03-09T21:53:55.159 INFO:teuthology.orchestra.run.vm11.stdout:Get:109 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse amd64 19.2.3-678-ge911bdeb-1jammy [92.2 kB]
2026-03-09T21:53:55.507 INFO:teuthology.orchestra.run.vm11.stdout:Fetched 178 MB in 9s (20.5 MB/s)
2026-03-09T21:53:55.752 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package liblttng-ust1:amd64.
2026-03-09T21:53:55.793 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 111717 files and directories currently installed.)
2026-03-09T21:53:55.795 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../000-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ...
2026-03-09T21:53:55.797 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-09T21:53:55.819 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libdouble-conversion3:amd64.
2026-03-09T21:53:55.825 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../001-libdouble-conversion3_3.1.7-4_amd64.deb ...
2026-03-09T21:53:55.826 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-09T21:53:55.843 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libpcre2-16-0:amd64.
2026-03-09T21:53:55.849 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../002-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ...
2026-03-09T21:53:55.850 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-09T21:53:55.873 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libqt5core5a:amd64.
2026-03-09T21:53:55.879 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../003-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-09T21:53:55.889 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T21:53:55.948 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libqt5dbus5:amd64.
2026-03-09T21:53:55.948 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../004-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-09T21:53:55.949 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T21:53:55.966 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libqt5network5:amd64.
2026-03-09T21:53:55.971 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../005-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-09T21:53:55.972 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T21:53:55.998 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libthrift-0.16.0:amd64.
2026-03-09T21:53:56.004 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../006-libthrift-0.16.0_0.16.0-2_amd64.deb ...
2026-03-09T21:53:56.005 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-09T21:53:56.029 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../007-librbd1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-09T21:53:56.032 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking librbd1 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-09T21:53:56.462 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../008-librados2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-09T21:53:56.558 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking librados2 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-09T21:53:56.658 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libnbd0.
2026-03-09T21:53:56.663 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../009-libnbd0_1.10.5-1_amd64.deb ...
2026-03-09T21:53:56.664 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libnbd0 (1.10.5-1) ...
2026-03-09T21:53:56.677 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libcephfs2.
2026-03-09T21:53:56.681 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../010-libcephfs2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-09T21:53:56.682 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:53:56.707 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-rados.
2026-03-09T21:53:56.711 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../011-python3-rados_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-09T21:53:56.712 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-rados (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:53:56.731 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-ceph-argparse.
2026-03-09T21:53:56.736 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../012-python3-ceph-argparse_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-09T21:53:56.737 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:53:56.749 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-cephfs.
2026-03-09T21:53:56.754 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../013-python3-cephfs_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-09T21:53:56.755 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:53:56.774 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-ceph-common.
2026-03-09T21:53:56.779 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../014-python3-ceph-common_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-09T21:53:56.780 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:53:56.799 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-wcwidth.
2026-03-09T21:53:56.804 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../015-python3-wcwidth_0.2.5+dfsg1-1_all.deb ...
2026-03-09T21:53:56.805 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-09T21:53:56.821 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-prettytable.
2026-03-09T21:53:56.826 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../016-python3-prettytable_2.5.0-2_all.deb ...
2026-03-09T21:53:56.827 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-prettytable (2.5.0-2) ...
2026-03-09T21:53:56.842 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-rbd.
2026-03-09T21:53:56.847 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../017-python3-rbd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-09T21:53:56.848 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-rbd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:53:56.870 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package librdkafka1:amd64.
2026-03-09T21:53:56.876 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../018-librdkafka1_1.8.0-1build1_amd64.deb ...
2026-03-09T21:53:56.877 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-09T21:53:56.898 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libreadline-dev:amd64.
2026-03-09T21:53:56.904 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../019-libreadline-dev_8.1.2-1_amd64.deb ...
2026-03-09T21:53:56.905 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libreadline-dev:amd64 (8.1.2-1) ...
2026-03-09T21:53:56.922 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package liblua5.3-dev:amd64.
2026-03-09T21:53:56.927 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../020-liblua5.3-dev_5.3.6-1build1_amd64.deb ...
2026-03-09T21:53:56.928 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking liblua5.3-dev:amd64 (5.3.6-1build1) ...
2026-03-09T21:53:56.947 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package lua5.1.
2026-03-09T21:53:56.952 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../021-lua5.1_5.1.5-8.1build4_amd64.deb ...
2026-03-09T21:53:56.953 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking lua5.1 (5.1.5-8.1build4) ...
2026-03-09T21:53:56.971 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package lua-any.
2026-03-09T21:53:56.976 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../022-lua-any_27ubuntu1_all.deb ...
2026-03-09T21:53:56.977 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking lua-any (27ubuntu1) ...
2026-03-09T21:53:56.989 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package zip.
2026-03-09T21:53:56.994 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../023-zip_3.0-12build2_amd64.deb ...
2026-03-09T21:53:56.995 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking zip (3.0-12build2) ... 2026-03-09T21:53:57.010 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package unzip. 2026-03-09T21:53:57.016 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../024-unzip_6.0-26ubuntu3.2_amd64.deb ... 2026-03-09T21:53:57.017 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking unzip (6.0-26ubuntu3.2) ... 2026-03-09T21:53:57.038 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package luarocks. 2026-03-09T21:53:57.044 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../025-luarocks_3.8.0+dfsg1-1_all.deb ... 2026-03-09T21:53:57.045 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking luarocks (3.8.0+dfsg1-1) ... 2026-03-09T21:53:57.094 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package librgw2. 2026-03-09T21:53:57.099 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../026-librgw2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:57.100 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:57.419 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-rgw. 2026-03-09T21:53:57.420 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../027-python3-rgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:57.435 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:57.453 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package liboath0:amd64. 2026-03-09T21:53:57.459 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../028-liboath0_2.6.7-3ubuntu0.1_amd64.deb ... 2026-03-09T21:53:57.460 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ... 
2026-03-09T21:53:57.475 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libradosstriper1. 2026-03-09T21:53:57.481 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../029-libradosstriper1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:57.482 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:57.507 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-common. 2026-03-09T21:53:57.513 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../030-ceph-common_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:57.513 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:58.102 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-base. 2026-03-09T21:53:58.107 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../031-ceph-base_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:58.111 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:58.239 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-jaraco.functools. 2026-03-09T21:53:58.244 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../032-python3-jaraco.functools_3.4.0-2_all.deb ... 2026-03-09T21:53:58.245 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ... 2026-03-09T21:53:58.261 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-cheroot. 2026-03-09T21:53:58.267 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../033-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ... 2026-03-09T21:53:58.268 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 
2026-03-09T21:53:58.289 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-jaraco.classes. 2026-03-09T21:53:58.295 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../034-python3-jaraco.classes_3.2.1-3_all.deb ... 2026-03-09T21:53:58.296 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ... 2026-03-09T21:53:58.312 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-jaraco.text. 2026-03-09T21:53:58.317 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../035-python3-jaraco.text_3.6.0-2_all.deb ... 2026-03-09T21:53:58.318 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-jaraco.text (3.6.0-2) ... 2026-03-09T21:53:58.338 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-jaraco.collections. 2026-03-09T21:53:58.343 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../036-python3-jaraco.collections_3.4.0-2_all.deb ... 2026-03-09T21:53:58.343 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ... 2026-03-09T21:53:58.362 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-tempora. 2026-03-09T21:53:58.368 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../037-python3-tempora_4.1.2-1_all.deb ... 2026-03-09T21:53:58.368 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-tempora (4.1.2-1) ... 2026-03-09T21:53:58.619 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-portend. 2026-03-09T21:53:58.624 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../038-python3-portend_3.0.0-1_all.deb ... 2026-03-09T21:53:58.660 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-portend (3.0.0-1) ... 2026-03-09T21:53:58.699 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-zc.lockfile. 
2026-03-09T21:53:58.703 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../039-python3-zc.lockfile_2.0-1_all.deb ... 2026-03-09T21:53:58.704 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-zc.lockfile (2.0-1) ... 2026-03-09T21:53:58.721 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-cherrypy3. 2026-03-09T21:53:58.727 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../040-python3-cherrypy3_18.6.1-4_all.deb ... 2026-03-09T21:53:58.727 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ... 2026-03-09T21:53:58.755 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-natsort. 2026-03-09T21:53:58.760 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../041-python3-natsort_8.0.2-1_all.deb ... 2026-03-09T21:53:58.761 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-natsort (8.0.2-1) ... 2026-03-09T21:53:58.777 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-logutils. 2026-03-09T21:53:58.782 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../042-python3-logutils_0.3.3-8_all.deb ... 2026-03-09T21:53:58.782 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-logutils (0.3.3-8) ... 2026-03-09T21:53:58.797 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-mako. 2026-03-09T21:53:58.802 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../043-python3-mako_1.1.3+ds1-2ubuntu0.1_all.deb ... 2026-03-09T21:53:58.803 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-09T21:53:58.820 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-simplegeneric. 2026-03-09T21:53:58.825 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../044-python3-simplegeneric_0.8.1-3_all.deb ... 
2026-03-09T21:53:58.826 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-simplegeneric (0.8.1-3) ... 2026-03-09T21:53:58.840 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-singledispatch. 2026-03-09T21:53:58.845 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../045-python3-singledispatch_3.4.0.3-3_all.deb ... 2026-03-09T21:53:58.846 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-singledispatch (3.4.0.3-3) ... 2026-03-09T21:53:58.858 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-webob. 2026-03-09T21:53:58.863 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../046-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ... 2026-03-09T21:53:58.864 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-09T21:53:58.882 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-waitress. 2026-03-09T21:53:58.887 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../047-python3-waitress_1.4.4-1.1ubuntu1.1_all.deb ... 2026-03-09T21:53:58.888 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-09T21:53:58.904 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-tempita. 2026-03-09T21:53:58.909 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../048-python3-tempita_0.5.2-6ubuntu1_all.deb ... 2026-03-09T21:53:58.909 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-tempita (0.5.2-6ubuntu1) ... 2026-03-09T21:53:58.923 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-paste. 2026-03-09T21:53:58.928 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../049-python3-paste_3.5.0+dfsg1-1_all.deb ... 2026-03-09T21:53:58.929 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-paste (3.5.0+dfsg1-1) ... 
2026-03-09T21:53:58.963 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python-pastedeploy-tpl. 2026-03-09T21:53:58.969 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../050-python-pastedeploy-tpl_2.1.1-1_all.deb ... 2026-03-09T21:53:58.970 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python-pastedeploy-tpl (2.1.1-1) ... 2026-03-09T21:53:58.986 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-pastedeploy. 2026-03-09T21:53:58.993 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../051-python3-pastedeploy_2.1.1-1_all.deb ... 2026-03-09T21:53:58.994 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-pastedeploy (2.1.1-1) ... 2026-03-09T21:53:59.012 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-webtest. 2026-03-09T21:53:59.018 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../052-python3-webtest_2.0.35-1_all.deb ... 2026-03-09T21:53:59.019 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-webtest (2.0.35-1) ... 2026-03-09T21:53:59.036 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-pecan. 2026-03-09T21:53:59.042 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../053-python3-pecan_1.3.3-4ubuntu2_all.deb ... 2026-03-09T21:53:59.043 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-pecan (1.3.3-4ubuntu2) ... 2026-03-09T21:53:59.076 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-werkzeug. 2026-03-09T21:53:59.081 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../054-python3-werkzeug_2.0.2+dfsg1-1ubuntu0.22.04.3_all.deb ... 2026-03-09T21:53:59.082 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 
2026-03-09T21:53:59.104 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mgr-modules-core. 2026-03-09T21:53:59.110 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../055-ceph-mgr-modules-core_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-09T21:53:59.110 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.168 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libsqlite3-mod-ceph. 2026-03-09T21:53:59.173 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../056-libsqlite3-mod-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.174 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.188 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mgr. 2026-03-09T21:53:59.194 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../057-ceph-mgr_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.194 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.229 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mon. 2026-03-09T21:53:59.234 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../058-ceph-mon_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.234 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.355 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libfuse2:amd64. 2026-03-09T21:53:59.360 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../059-libfuse2_2.9.9-5ubuntu3_amd64.deb ... 2026-03-09T21:53:59.361 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ... 
2026-03-09T21:53:59.378 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-osd. 2026-03-09T21:53:59.384 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../060-ceph-osd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.384 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.764 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph. 2026-03-09T21:53:59.771 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../061-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.771 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.785 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-fuse. 2026-03-09T21:53:59.792 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../062-ceph-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.793 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.827 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mds. 2026-03-09T21:53:59.833 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../063-ceph-mds_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.834 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.888 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package cephadm. 2026-03-09T21:53:59.895 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../064-cephadm_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:53:59.895 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking cephadm (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-09T21:53:59.913 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-asyncssh. 2026-03-09T21:53:59.918 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../065-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ... 2026-03-09T21:53:59.919 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-09T21:53:59.947 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mgr-cephadm. 2026-03-09T21:53:59.951 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../066-ceph-mgr-cephadm_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-09T21:53:59.952 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:53:59.979 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-repoze.lru. 2026-03-09T21:53:59.982 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../067-python3-repoze.lru_0.7-2_all.deb ... 2026-03-09T21:53:59.983 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-repoze.lru (0.7-2) ... 2026-03-09T21:54:00.003 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-routes. 2026-03-09T21:54:00.008 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../068-python3-routes_2.5.1-1ubuntu1_all.deb ... 2026-03-09T21:54:00.009 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ... 2026-03-09T21:54:00.043 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mgr-dashboard. 2026-03-09T21:54:00.044 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../069-ceph-mgr-dashboard_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-09T21:54:00.045 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-09T21:54:00.551 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-sklearn-lib:amd64. 2026-03-09T21:54:00.557 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../070-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ... 2026-03-09T21:54:00.558 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 2026-03-09T21:54:00.647 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-joblib. 2026-03-09T21:54:00.652 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../071-python3-joblib_0.17.0-4ubuntu1_all.deb ... 2026-03-09T21:54:00.653 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ... 2026-03-09T21:54:00.687 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-threadpoolctl. 2026-03-09T21:54:00.692 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../072-python3-threadpoolctl_3.1.0-1_all.deb ... 2026-03-09T21:54:00.693 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ... 2026-03-09T21:54:00.709 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-sklearn. 2026-03-09T21:54:00.715 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../073-python3-sklearn_0.23.2-5ubuntu6_all.deb ... 2026-03-09T21:54:00.716 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-09T21:54:00.865 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local. 2026-03-09T21:54:00.871 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../074-ceph-mgr-diskprediction-local_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-09T21:54:00.872 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-09T21:54:01.407 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-cachetools. 2026-03-09T21:54:01.409 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../075-python3-cachetools_5.0.0-1_all.deb ... 2026-03-09T21:54:01.411 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-cachetools (5.0.0-1) ... 2026-03-09T21:54:01.427 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-rsa. 2026-03-09T21:54:01.431 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../076-python3-rsa_4.8-1_all.deb ... 2026-03-09T21:54:01.432 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-rsa (4.8-1) ... 2026-03-09T21:54:01.451 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-google-auth. 2026-03-09T21:54:01.457 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../077-python3-google-auth_1.5.1-3_all.deb ... 2026-03-09T21:54:01.458 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-google-auth (1.5.1-3) ... 2026-03-09T21:54:01.478 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-requests-oauthlib. 2026-03-09T21:54:01.484 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../078-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ... 2026-03-09T21:54:01.485 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-09T21:54:01.505 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-websocket. 2026-03-09T21:54:01.511 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../079-python3-websocket_1.2.3-1_all.deb ... 2026-03-09T21:54:01.512 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-websocket (1.2.3-1) ... 2026-03-09T21:54:01.533 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-kubernetes. 
2026-03-09T21:54:01.538 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../080-python3-kubernetes_12.0.1-1ubuntu1_all.deb ... 2026-03-09T21:54:01.554 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-09T21:54:01.714 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-mgr-k8sevents. 2026-03-09T21:54:01.720 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../081-ceph-mgr-k8sevents_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-09T21:54:01.720 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:54:01.736 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libonig5:amd64. 2026-03-09T21:54:01.741 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../082-libonig5_6.9.7.1-2build1_amd64.deb ... 2026-03-09T21:54:01.742 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-09T21:54:01.759 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libjq1:amd64. 2026-03-09T21:54:01.764 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../083-libjq1_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-09T21:54:01.765 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-09T21:54:01.781 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package jq. 2026-03-09T21:54:01.786 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../084-jq_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-09T21:54:01.787 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ... 2026-03-09T21:54:01.801 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package socat. 2026-03-09T21:54:01.806 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../085-socat_1.7.4.1-3ubuntu4_amd64.deb ... 
2026-03-09T21:54:01.807 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ... 2026-03-09T21:54:01.830 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package xmlstarlet. 2026-03-09T21:54:01.835 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../086-xmlstarlet_1.6.1-2.1_amd64.deb ... 2026-03-09T21:54:01.836 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking xmlstarlet (1.6.1-2.1) ... 2026-03-09T21:54:01.880 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-test. 2026-03-09T21:54:01.886 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../087-ceph-test_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:54:01.886 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:54:03.222 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package ceph-volume. 2026-03-09T21:54:03.229 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../088-ceph-volume_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-09T21:54:03.230 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:54:03.260 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package libcephfs-dev. 2026-03-09T21:54:03.266 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../089-libcephfs-dev_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:54:03.267 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:54:03.286 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package lua-socket:amd64. 2026-03-09T21:54:03.292 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../090-lua-socket_3.0~rc1+git+ac3201d-6_amd64.deb ... 
2026-03-09T21:54:03.293 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-09T21:54:03.320 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package lua-sec:amd64. 2026-03-09T21:54:03.327 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../091-lua-sec_1.0.2-1_amd64.deb ... 2026-03-09T21:54:03.328 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking lua-sec:amd64 (1.0.2-1) ... 2026-03-09T21:54:03.349 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package nvme-cli. 2026-03-09T21:54:03.356 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../092-nvme-cli_1.16-3ubuntu0.3_amd64.deb ... 2026-03-09T21:54:03.357 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ... 2026-03-09T21:54:03.399 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package pkg-config. 2026-03-09T21:54:03.405 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../093-pkg-config_0.29.2-1ubuntu3_amd64.deb ... 2026-03-09T21:54:03.406 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking pkg-config (0.29.2-1ubuntu3) ... 2026-03-09T21:54:03.424 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python-asyncssh-doc. 2026-03-09T21:54:03.430 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../094-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ... 2026-03-09T21:54:03.431 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-09T21:54:03.482 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-iniconfig. 2026-03-09T21:54:03.488 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../095-python3-iniconfig_1.1.1-2_all.deb ... 2026-03-09T21:54:03.489 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-iniconfig (1.1.1-2) ... 
2026-03-09T21:54:03.506 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-pastescript. 2026-03-09T21:54:03.512 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../096-python3-pastescript_2.0.2-4_all.deb ... 2026-03-09T21:54:03.513 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-pastescript (2.0.2-4) ... 2026-03-09T21:54:03.535 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-pluggy. 2026-03-09T21:54:03.543 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../097-python3-pluggy_0.13.0-7.1_all.deb ... 2026-03-09T21:54:03.544 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-pluggy (0.13.0-7.1) ... 2026-03-09T21:54:03.564 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-psutil. 2026-03-09T21:54:03.570 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../098-python3-psutil_5.9.0-1build1_amd64.deb ... 2026-03-09T21:54:03.571 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-psutil (5.9.0-1build1) ... 2026-03-09T21:54:03.594 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-py. 2026-03-09T21:54:03.600 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../099-python3-py_1.10.0-1_all.deb ... 2026-03-09T21:54:03.601 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-py (1.10.0-1) ... 2026-03-09T21:54:03.626 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-pygments. 2026-03-09T21:54:03.632 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../100-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ... 2026-03-09T21:54:03.632 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ... 2026-03-09T21:54:03.722 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-pyinotify. 
2026-03-09T21:54:03.728 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../101-python3-pyinotify_0.9.6-1.3_all.deb ... 2026-03-09T21:54:03.729 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-pyinotify (0.9.6-1.3) ... 2026-03-09T21:54:03.747 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-toml. 2026-03-09T21:54:03.754 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../102-python3-toml_0.10.2-1_all.deb ... 2026-03-09T21:54:03.754 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-toml (0.10.2-1) ... 2026-03-09T21:54:03.769 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-pytest. 2026-03-09T21:54:03.775 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../103-python3-pytest_6.2.5-1ubuntu2_all.deb ... 2026-03-09T21:54:03.775 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ... 2026-03-09T21:54:03.803 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-simplejson. 2026-03-09T21:54:03.809 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../104-python3-simplejson_3.17.6-1build1_amd64.deb ... 2026-03-09T21:54:03.810 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-simplejson (3.17.6-1build1) ... 2026-03-09T21:54:03.830 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package qttranslations5-l10n. 2026-03-09T21:54:03.836 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../105-qttranslations5-l10n_5.15.3-1_all.deb ... 2026-03-09T21:54:03.837 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ... 2026-03-09T21:54:03.981 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package radosgw. 2026-03-09T21:54:03.987 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../106-radosgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 
2026-03-09T21:54:03.988 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:54:04.398 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package rbd-fuse. 2026-03-09T21:54:04.403 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../107-rbd-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-09T21:54:04.404 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T21:54:04.422 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package smartmontools. 2026-03-09T21:54:04.428 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../108-smartmontools_7.2-1ubuntu0.1_amd64.deb ... 2026-03-09T21:54:04.435 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ... 2026-03-09T21:54:04.480 INFO:teuthology.orchestra.run.vm11.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ... 2026-03-09T21:54:04.735 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service. 2026-03-09T21:54:04.735 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service. 2026-03-09T21:54:05.131 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-iniconfig (1.1.1-2) ... 2026-03-09T21:54:05.200 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-09T21:54:05.203 INFO:teuthology.orchestra.run.vm11.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ... 2026-03-09T21:54:05.264 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service. 
2026-03-09T21:54:05.534 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service.
2026-03-09T21:54:05.936 INFO:teuthology.orchestra.run.vm11.stdout:nvmf-connect.target is a disabled or a static unit, not starting it.
2026-03-09T21:54:05.942 INFO:teuthology.orchestra.run.vm11.stdout:Could not execute systemctl: at /usr/bin/deb-systemd-invoke line 142.
2026-03-09T21:54:05.944 INFO:teuthology.orchestra.run.vm11.stdout:Setting up cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:05.986 INFO:teuthology.orchestra.run.vm11.stdout:Adding system user cephadm....done
2026-03-09T21:54:05.995 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-waitress (1.4.4-1.1ubuntu1.1) ...
2026-03-09T21:54:06.073 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-jaraco.classes (3.2.1-3) ...
2026-03-09T21:54:06.139 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-09T21:54:06.142 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-jaraco.functools (3.4.0-2) ...
2026-03-09T21:54:06.202 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-repoze.lru (0.7-2) ...
2026-03-09T21:54:06.275 INFO:teuthology.orchestra.run.vm11.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-09T21:54:06.279 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-py (1.10.0-1) ...
2026-03-09T21:54:06.370 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ...
2026-03-09T21:54:06.491 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-cachetools (5.0.0-1) ...
2026-03-09T21:54:06.563 INFO:teuthology.orchestra.run.vm11.stdout:Setting up unzip (6.0-26ubuntu3.2) ...
2026-03-09T21:54:06.571 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-pyinotify (0.9.6-1.3) ...
2026-03-09T21:54:06.640 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-threadpoolctl (3.1.0-1) ...
2026-03-09T21:54:06.706 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:06.855 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-09T21:54:06.988 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libnbd0 (1.10.5-1) ...
2026-03-09T21:54:06.992 INFO:teuthology.orchestra.run.vm11.stdout:Setting up lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ...
2026-03-09T21:54:06.995 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libreadline-dev:amd64 (8.1.2-1) ...
2026-03-09T21:54:06.997 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-09T21:54:07.000 INFO:teuthology.orchestra.run.vm11.stdout:Setting up lua5.1 (5.1.5-8.1build4) ...
2026-03-09T21:54:07.004 INFO:teuthology.orchestra.run.vm11.stdout:update-alternatives: using /usr/bin/lua5.1 to provide /usr/bin/lua (lua-interpreter) in auto mode
2026-03-09T21:54:07.006 INFO:teuthology.orchestra.run.vm11.stdout:update-alternatives: using /usr/bin/luac5.1 to provide /usr/bin/luac (lua-compiler) in auto mode
2026-03-09T21:54:07.008 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-09T21:54:07.011 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-psutil (5.9.0-1build1) ...
2026-03-09T21:54:07.129 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-natsort (8.0.2-1) ...
2026-03-09T21:54:07.200 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ...
2026-03-09T21:54:07.268 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-simplejson (3.17.6-1build1) ...
2026-03-09T21:54:07.345 INFO:teuthology.orchestra.run.vm11.stdout:Setting up zip (3.0-12build2) ...
2026-03-09T21:54:07.348 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-09T21:54:07.619 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-tempita (0.5.2-6ubuntu1) ...
2026-03-09T21:54:07.796 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python-pastedeploy-tpl (2.1.1-1) ...
2026-03-09T21:54:07.915 INFO:teuthology.orchestra.run.vm11.stdout:Setting up qttranslations5-l10n (5.15.3-1) ...
2026-03-09T21:54:07.918 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-09T21:54:08.020 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-09T21:54:08.196 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-paste (3.5.0+dfsg1-1) ...
2026-03-09T21:54:08.419 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-09T21:54:08.510 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ...
2026-03-09T21:54:08.626 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-jaraco.text (3.6.0-2) ...
2026-03-09T21:54:08.691 INFO:teuthology.orchestra.run.vm11.stdout:Setting up socat (1.7.4.1-3ubuntu4) ...
2026-03-09T21:54:08.693 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:08.784 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-09T21:54:09.371 INFO:teuthology.orchestra.run.vm11.stdout:Setting up pkg-config (0.29.2-1ubuntu3) ...
2026-03-09T21:54:09.393 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T21:54:09.398 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-toml (0.10.2-1) ...
2026-03-09T21:54:09.468 INFO:teuthology.orchestra.run.vm11.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-09T21:54:09.471 INFO:teuthology.orchestra.run.vm11.stdout:Setting up xmlstarlet (1.6.1-2.1) ...
2026-03-09T21:54:09.474 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-pluggy (0.13.0-7.1) ...
2026-03-09T21:54:09.541 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-zc.lockfile (2.0-1) ...
2026-03-09T21:54:09.603 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T21:54:09.606 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-rsa (4.8-1) ...
2026-03-09T21:54:09.685 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-singledispatch (3.4.0.3-3) ...
2026-03-09T21:54:09.751 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-logutils (0.3.3-8) ...
2026-03-09T21:54:09.821 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-tempora (4.1.2-1) ...
2026-03-09T21:54:09.893 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-simplegeneric (0.8.1-3) ...
2026-03-09T21:54:09.968 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-prettytable (2.5.0-2) ...
2026-03-09T21:54:10.142 INFO:teuthology.orchestra.run.vm11.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-09T21:54:10.241 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-websocket (1.2.3-1) ...
2026-03-09T21:54:10.322 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-09T21:54:10.325 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-09T21:54:10.395 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-mako (1.1.3+ds1-2ubuntu0.1) ...
2026-03-09T21:54:10.479 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-09T21:54:10.570 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-jaraco.collections (3.4.0-2) ...
2026-03-09T21:54:10.638 INFO:teuthology.orchestra.run.vm11.stdout:Setting up liblua5.3-dev:amd64 (5.3.6-1build1) ...
2026-03-09T21:54:10.640 INFO:teuthology.orchestra.run.vm11.stdout:Setting up lua-sec:amd64 (1.0.2-1) ...
2026-03-09T21:54:10.642 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-09T21:54:10.645 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ...
2026-03-09T21:54:10.781 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-pastedeploy (2.1.1-1) ...
2026-03-09T21:54:10.856 INFO:teuthology.orchestra.run.vm11.stdout:Setting up lua-any (27ubuntu1) ...
2026-03-09T21:54:10.859 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-portend (3.0.0-1) ...
2026-03-09T21:54:10.927 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T21:54:10.929 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-google-auth (1.5.1-3) ...
2026-03-09T21:54:11.005 INFO:teuthology.orchestra.run.vm11.stdout:Setting up jq (1.6-2.1ubuntu3.1) ...
2026-03-09T21:54:11.008 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-webtest (2.0.35-1) ...
2026-03-09T21:54:11.079 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-cherrypy3 (18.6.1-4) ...
2026-03-09T21:54:11.207 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-pastescript (2.0.2-4) ...
2026-03-09T21:54:11.295 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-pecan (1.3.3-4ubuntu2) ...
2026-03-09T21:54:11.412 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-09T21:54:11.415 INFO:teuthology.orchestra.run.vm11.stdout:Setting up librados2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:11.417 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:11.420 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-09T21:54:12.113 INFO:teuthology.orchestra.run.vm11.stdout:Setting up luarocks (3.8.0+dfsg1-1) ...
2026-03-09T21:54:12.142 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.144 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.147 INFO:teuthology.orchestra.run.vm11.stdout:Setting up librbd1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.150 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.153 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.220 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-09T21:54:12.220 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-09T21:54:12.573 INFO:teuthology.orchestra.run.vm11.stdout:Setting up libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.575 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-rados (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.579 INFO:teuthology.orchestra.run.vm11.stdout:Setting up librgw2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.587 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-rbd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.589 INFO:teuthology.orchestra.run.vm11.stdout:Setting up rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.695 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-rgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.847 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.883 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:12.934 INFO:teuthology.orchestra.run.vm11.stdout:Adding group ceph....done
2026-03-09T21:54:12.967 INFO:teuthology.orchestra.run.vm11.stdout:Adding system user ceph....done
2026-03-09T21:54:12.976 INFO:teuthology.orchestra.run.vm11.stdout:Setting system user ceph properties....done
2026-03-09T21:54:12.980 INFO:teuthology.orchestra.run.vm11.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory
2026-03-09T21:54:13.051 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target.
2026-03-09T21:54:13.322 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service.
2026-03-09T21:54:13.719 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-test (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:13.722 INFO:teuthology.orchestra.run.vm11.stdout:Setting up radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:13.946 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-09T21:54:13.947 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-09T21:54:14.338 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:14.428 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service.
2026-03-09T21:54:14.838 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:14.908 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-09T21:54:14.908 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-09T21:54:15.286 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:15.352 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-09T21:54:15.352 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-09T21:54:15.853 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:16.021 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-09T21:54:16.021 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-09T21:54:16.419 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:16.422 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:16.439 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:16.502 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-09T21:54:16.502 INFO:teuthology.orchestra.run.vm11.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-09T21:54:16.916 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:16.931 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:16.933 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:16.950 INFO:teuthology.orchestra.run.vm11.stdout:Setting up ceph-volume (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T21:54:17.080 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ...
2026-03-09T21:54:17.088 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-09T21:54:17.104 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-09T21:54:17.187 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for install-info (6.8-4build1) ...
2026-03-09T21:54:17.661 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:17.661 INFO:teuthology.orchestra.run.vm11.stdout:Running kernel seems to be up-to-date.
2026-03-09T21:54:17.661 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:17.661 INFO:teuthology.orchestra.run.vm11.stdout:Services to be restarted:
2026-03-09T21:54:17.669 INFO:teuthology.orchestra.run.vm11.stdout: systemctl restart packagekit.service
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:Service restarts being deferred:
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout: systemctl restart networkd-dispatcher.service
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout: systemctl restart unattended-upgrades.service
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:No containers need to be restarted.
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:No user sessions are running outdated binaries.
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:17.672 INFO:teuthology.orchestra.run.vm11.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-09T21:54:18.655 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T21:54:18.658 DEBUG:teuthology.orchestra.run.vm11:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-xmltodict python3-jmespath
2026-03-09T21:54:18.731 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T21:54:18.907 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T21:54:18.907 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T21:54:19.083 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T21:54:19.083 INFO:teuthology.orchestra.run.vm11.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-09T21:54:19.083 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-09T21:54:19.083 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T21:54:19.094 INFO:teuthology.orchestra.run.vm11.stdout:The following NEW packages will be installed:
2026-03-09T21:54:19.094 INFO:teuthology.orchestra.run.vm11.stdout: python3-jmespath python3-xmltodict
2026-03-09T21:54:19.183 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 2 newly installed, 0 to remove and 10 not upgraded.
2026-03-09T21:54:19.183 INFO:teuthology.orchestra.run.vm11.stdout:Need to get 34.3 kB of archives.
2026-03-09T21:54:19.183 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 146 kB of additional disk space will be used.
2026-03-09T21:54:19.183 INFO:teuthology.orchestra.run.vm11.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB]
2026-03-09T21:54:19.201 INFO:teuthology.orchestra.run.vm11.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB]
2026-03-09T21:54:19.389 INFO:teuthology.orchestra.run.vm11.stdout:Fetched 34.3 kB in 0s (328 kB/s)
2026-03-09T21:54:19.852 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-jmespath.
2026-03-09T21:54:19.883 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118577 files and directories currently installed.)
2026-03-09T21:54:19.885 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ...
2026-03-09T21:54:19.888 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-jmespath (0.10.0-1) ...
2026-03-09T21:54:19.907 INFO:teuthology.orchestra.run.vm11.stdout:Selecting previously unselected package python3-xmltodict.
2026-03-09T21:54:19.911 INFO:teuthology.orchestra.run.vm11.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ...
2026-03-09T21:54:19.912 INFO:teuthology.orchestra.run.vm11.stdout:Unpacking python3-xmltodict (0.12.0-2) ...
2026-03-09T21:54:19.942 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-xmltodict (0.12.0-2) ...
2026-03-09T21:54:20.013 INFO:teuthology.orchestra.run.vm11.stdout:Setting up python3-jmespath (0.10.0-1) ...
2026-03-09T21:54:20.366 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:20.366 INFO:teuthology.orchestra.run.vm11.stdout:Running kernel seems to be up-to-date.
2026-03-09T21:54:20.366 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:20.366 INFO:teuthology.orchestra.run.vm11.stdout:Services to be restarted:
2026-03-09T21:54:20.372 INFO:teuthology.orchestra.run.vm11.stdout: systemctl restart packagekit.service
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:Service restarts being deferred:
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout: systemctl restart networkd-dispatcher.service
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout: systemctl restart unattended-upgrades.service
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:No containers need to be restarted.
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:No user sessions are running outdated binaries.
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:
2026-03-09T21:54:20.376 INFO:teuthology.orchestra.run.vm11.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-09T21:54:21.333 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T21:54:21.336 DEBUG:teuthology.parallel:result is None
2026-03-09T21:54:21.336 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T21:54:21.920 DEBUG:teuthology.orchestra.run.vm11:> dpkg-query -W -f '${Version}' ceph
2026-03-09T21:54:21.929 INFO:teuthology.orchestra.run.vm11.stdout:19.2.3-678-ge911bdeb-1jammy
2026-03-09T21:54:21.929 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678-ge911bdeb-1jammy
2026-03-09T21:54:21.929 INFO:teuthology.task.install:The correct ceph version 19.2.3-678-ge911bdeb-1jammy is installed.
2026-03-09T21:54:21.930 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-09T21:54:21.931 DEBUG:teuthology.orchestra.run.vm11:> set -ex
2026-03-09T21:54:21.931 DEBUG:teuthology.orchestra.run.vm11:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-09T21:54:21.978 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-09T21:54:21.978 DEBUG:teuthology.orchestra.run.vm11:> set -ex
2026-03-09T21:54:21.978 DEBUG:teuthology.orchestra.run.vm11:> sudo dd of=/usr/bin/daemon-helper
2026-03-09T21:54:22.026 DEBUG:teuthology.orchestra.run.vm11:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-09T21:54:22.074 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-09T21:54:22.074 DEBUG:teuthology.orchestra.run.vm11:> set -ex
2026-03-09T21:54:22.075 DEBUG:teuthology.orchestra.run.vm11:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-09T21:54:22.123 DEBUG:teuthology.orchestra.run.vm11:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-09T21:54:22.175 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-09T21:54:22.175 DEBUG:teuthology.orchestra.run.vm11:> set -ex
2026-03-09T21:54:22.175 DEBUG:teuthology.orchestra.run.vm11:> sudo dd of=/usr/bin/stdin-killer
2026-03-09T21:54:22.224 DEBUG:teuthology.orchestra.run.vm11:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-09T21:54:22.277 INFO:teuthology.run_tasks:Running task exec...
2026-03-09T21:54:22.280 INFO:teuthology.task.exec:Executing custom commands...
2026-03-09T21:54:22.280 INFO:teuthology.task.exec:Running commands on role mon.a host ubuntu@vm11.local
2026-03-09T21:54:22.280 DEBUG:teuthology.orchestra.run.vm11:> sudo TESTDIR=/home/ubuntu/cephtest bash -c 'yum install -y python3 || apt install -y python3'
2026-03-09T21:54:22.328 INFO:teuthology.orchestra.run.vm11.stderr:bash: line 1: yum: command not found
2026-03-09T21:54:22.332 INFO:teuthology.orchestra.run.vm11.stderr:
2026-03-09T21:54:22.332 INFO:teuthology.orchestra.run.vm11.stderr:WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
2026-03-09T21:54:22.332 INFO:teuthology.orchestra.run.vm11.stderr:
2026-03-09T21:54:22.362 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T21:54:22.585 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T21:54:22.586 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T21:54:22.760 INFO:teuthology.orchestra.run.vm11.stdout:python3 is already the newest version (3.10.6-1~22.04.1).
2026-03-09T21:54:22.760 INFO:teuthology.orchestra.run.vm11.stdout:python3 set to manually installed.
2026-03-09T21:54:22.760 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T21:54:22.761 INFO:teuthology.orchestra.run.vm11.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-09T21:54:22.761 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-09T21:54:22.761 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T21:54:22.847 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-09T21:54:22.905 INFO:teuthology.run_tasks:Running task workunit...
2026-03-09T21:54:22.909 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589
2026-03-09T21:54:22.909 INFO:tasks.workunit:Making a separate scratch dir for every client...
2026-03-09T21:54:22.909 DEBUG:teuthology.orchestra.run.vm11:> stat -- /home/ubuntu/cephtest/mnt.0
2026-03-09T21:54:22.951 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T21:54:22.951 INFO:teuthology.orchestra.run.vm11.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory
2026-03-09T21:54:22.951 DEBUG:teuthology.orchestra.run.vm11:> mkdir -- /home/ubuntu/cephtest/mnt.0
2026-03-09T21:54:22.998 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0
2026-03-09T21:54:22.998 DEBUG:teuthology.orchestra.run.vm11:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0
2026-03-09T21:54:23.042 INFO:tasks.workunit:timeout=3h
2026-03-09T21:54:23.042 INFO:tasks.workunit:cleanup=True
2026-03-09T21:54:23.042 DEBUG:teuthology.orchestra.run.vm11:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589
2026-03-09T21:54:23.087 INFO:tasks.workunit.client.0.vm11.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'...
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'.
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:state without impacting any branches by switching back to a branch.
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr: git switch -c
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:Or undo this operation with:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr: git switch -
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:
2026-03-09T21:55:24.703 INFO:tasks.workunit.client.0.vm11.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose
2026-03-09T21:55:24.710 DEBUG:teuthology.orchestra.run.vm11:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0
2026-03-09T21:55:24.754 INFO:tasks.workunit.client.0.vm11.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2026-03-09T21:55:24.755 INFO:tasks.workunit.client.0.vm11.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-09T21:55:24.755 INFO:tasks.workunit.client.0.vm11.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2026-03-09T21:55:24.799 INFO:tasks.workunit.client.0.vm11.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2026-03-09T21:55:24.829 INFO:tasks.workunit.client.0.vm11.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2026-03-09T21:55:24.856 INFO:tasks.workunit.client.0.vm11.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-09T21:55:24.857 INFO:tasks.workunit.client.0.vm11.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-09T21:55:24.857 INFO:tasks.workunit.client.0.vm11.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2026-03-09T21:55:24.880 INFO:tasks.workunit.client.0.vm11.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-09T21:55:24.882 DEBUG:teuthology.orchestra.run.vm11:> set -ex
2026-03-09T21:55:24.882 DEBUG:teuthology.orchestra.run.vm11:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout
2026-03-09T21:55:24.927 INFO:tasks.workunit:Running workunits matching cephadm/test_cephadm.sh on client.0...
2026-03-09T21:55:24.928 INFO:tasks.workunit:Running workunit cephadm/test_cephadm.sh...
2026-03-09T21:55:24.928 DEBUG:teuthology.orchestra.run.vm11:workunit test cephadm/test_cephadm.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_cephadm.sh
2026-03-09T21:55:24.974 INFO:tasks.workunit.client.0.vm11.stderr:++ basename /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_cephadm.sh
2026-03-09T21:55:24.974 INFO:tasks.workunit.client.0.vm11.stderr:+ SCRIPT_NAME=test_cephadm.sh
2026-03-09T21:55:24.974 INFO:tasks.workunit.client.0.vm11.stderr:+++ dirname /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_cephadm.sh
2026-03-09T21:55:24.975 INFO:tasks.workunit.client.0.vm11.stderr:++ cd /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm
2026-03-09T21:55:24.975 INFO:tasks.workunit.client.0.vm11.stderr:++ pwd 2026-03-09T21:55:24.975 INFO:tasks.workunit.client.0.vm11.stderr:+ SCRIPT_DIR=/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm 2026-03-09T21:55:24.975 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -z '' ']' 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ CLEANUP=true 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ FSID=00000000-0000-0000-0000-0000deadbeef 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ IMAGE_MAIN=quay.ceph.io/ceph-ci/ceph:main 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ IMAGE_QUINCY=quay.ceph.io/ceph-ci/ceph:quincy 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ IMAGE_REEF=quay.ceph.io/ceph-ci/ceph:reef 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ IMAGE_SQUID=quay.ceph.io/ceph-ci/ceph:squid 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ IMAGE_DEFAULT=quay.ceph.io/ceph-ci/ceph:squid 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ OSD_IMAGE_NAME=test_cephadm_osd.img 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ OSD_IMAGE_SIZE=6G 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ OSD_TO_CREATE=2 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ OSD_VG_NAME=test_cephadm 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ OSD_LV_NAME=test_cephadm 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -d '' ']' 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:++ mktemp -d tmp.test_cephadm.sh.XXXXXX 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ TMPDIR=tmp.test_cephadm.sh.Qa6KUG 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -d '' ']' 2026-03-09T21:55:24.976 INFO:tasks.workunit.client.0.vm11.stderr:++ mktemp -d tmp.test_cephadm.sh.XXXXXX 2026-03-09T21:55:24.977 
INFO:tasks.workunit.client.0.vm11.stderr:+ TMPDIR_TEST_MULTIPLE_MOUNTS=tmp.test_cephadm.sh.Lgi6Ng 2026-03-09T21:55:24.977 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPHADM_SRC_DIR=/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm 2026-03-09T21:55:24.977 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPHADM_SAMPLES_DIR=/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples 2026-03-09T21:55:24.977 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -z '' ']' 2026-03-09T21:55:24.977 INFO:tasks.workunit.client.0.vm11.stderr:+ SUDO=sudo 2026-03-09T21:55:24.977 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -z '' ']' 2026-03-09T21:55:24.977 INFO:tasks.workunit.client.0.vm11.stderr:+ command -v cephadm 2026-03-09T21:55:24.977 INFO:tasks.workunit.client.0.vm11.stderr:++ command -v cephadm 2026-03-09T21:55:24.978 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPHADM=/usr/sbin/cephadm 2026-03-09T21:55:24.978 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -z /usr/sbin/cephadm ']' 2026-03-09T21:55:24.978 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -x /usr/sbin/cephadm ']' 2026-03-09T21:55:24.978 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPHADM_ARGS=' --image quay.ceph.io/ceph-ci/ceph:squid' 2026-03-09T21:55:24.978 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPHADM_BIN=/usr/sbin/cephadm 2026-03-09T21:55:24.978 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPHADM='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid' 2026-03-09T21:55:24.978 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --force 2026-03-09T21:55:25.067 INFO:tasks.workunit.client.0.vm11.stdout:Deleting cluster with fsid: 00000000-0000-0000-0000-0000deadbeef 2026-03-09T21:55:26.317 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo vgchange -an test_cephadm 2026-03-09T21:55:26.329 INFO:tasks.workunit.client.0.vm11.stderr: Volume group 
"test_cephadm" not found 2026-03-09T21:55:26.329 INFO:tasks.workunit.client.0.vm11.stderr: Cannot process volume group test_cephadm 2026-03-09T21:55:26.365 INFO:tasks.workunit.client.0.vm11.stderr:+ true 2026-03-09T21:55:26.365 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo losetup -a 2026-03-09T21:55:26.365 INFO:tasks.workunit.client.0.vm11.stderr:++ awk -F : '{print $1}' 2026-03-09T21:55:26.366 INFO:tasks.workunit.client.0.vm11.stderr:+++ basename test_cephadm_osd.img 2026-03-09T21:55:26.369 INFO:tasks.workunit.client.0.vm11.stderr:++ grep test_cephadm_osd.img 2026-03-09T21:55:26.371 INFO:tasks.workunit.client.0.vm11.stderr:+ loopdev= 2026-03-09T21:55:26.371 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' '' = '' ']' 2026-03-09T21:55:26.371 INFO:tasks.workunit.client.0.vm11.stderr:+ trap cleanup EXIT 2026-03-09T21:55:26.371 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid check-host 2026-03-09T21:55:26.472 INFO:tasks.workunit.client.0.vm11.stderr:docker (/usr/bin/docker) is present 2026-03-09T21:55:26.472 INFO:tasks.workunit.client.0.vm11.stderr:systemctl is present 2026-03-09T21:55:26.472 INFO:tasks.workunit.client.0.vm11.stderr:lvcreate is present 2026-03-09T21:55:26.496 INFO:tasks.workunit.client.0.vm11.stderr:Unit ntp.service is enabled and running 2026-03-09T21:55:26.496 INFO:tasks.workunit.client.0.vm11.stderr:Host looks OK 2026-03-09T21:55:26.515 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid gather-facts 2026-03-09T21:55:26.655 INFO:tasks.workunit.client.0.vm11.stdout:{ 2026-03-09T21:55:26.655 INFO:tasks.workunit.client.0.vm11.stdout: "arch": "x86_64", 2026-03-09T21:55:26.655 INFO:tasks.workunit.client.0.vm11.stdout: "bios_date": "04/01/2014", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "bios_version": "1.16.3-debian-1.16.3-2", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "board_serial": 
"Unknown", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "chassis_serial": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "cpu_cores": 1, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "cpu_count": 2, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "cpu_load": { 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "15min": 0.29, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "1min": 1.16, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "5min": 0.67 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "cpu_model": "AMD Ryzen 9 7950X3D 16-Core Processor", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "cpu_threads": 1, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_count": 0, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosures": {}, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "flash_capacity": "0.0", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "flash_capacity_bytes": 0, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "flash_count": 0, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "flash_list": [], 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "fqdn": "vm11.local", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "hdd_capacity": "128.8GB", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "hdd_capacity_bytes": 128849018880, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "hdd_count": 5, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "hdd_list": [ 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: { 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "alt_dev_name": "", 
2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "description": "Virtio Block Device Unknown (21.5GB)", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "dev_name": "vdd", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "disk_size_bytes": 21474836480, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "disk_type": "hdd", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_id": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_slot": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "model": "Unknown", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "mpath": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:08.0", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "Unknown", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "scsi_addr": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "serial": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "Virtio Block Device", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "wwid": "Unknown" 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: { 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "alt_dev_name": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "description": "Virtio Block Device Unknown (21.5GB)", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "dev_name": "vdb", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "disk_size_bytes": 21474836480, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "disk_type": "hdd", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_id": "", 
2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_slot": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "model": "Unknown", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "mpath": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:06.0", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "Unknown", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "scsi_addr": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "serial": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "Virtio Block Device", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "wwid": "Unknown" 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: { 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "alt_dev_name": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "description": "Virtio Block Device Unknown (21.5GB)", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "dev_name": "vde", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "disk_size_bytes": 21474836480, 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "disk_type": "hdd", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_id": "", 2026-03-09T21:55:26.656 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_slot": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "model": "Unknown", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "mpath": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:09.0", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "Unknown", 2026-03-09T21:55:26.657 
INFO:tasks.workunit.client.0.vm11.stdout: "scsi_addr": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "serial": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "Virtio Block Device", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "wwid": "Unknown" 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: { 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "alt_dev_name": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "description": "Virtio Block Device Unknown (21.5GB)", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "dev_name": "vdc", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "disk_size_bytes": 21474836480, 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "disk_type": "hdd", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_id": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_slot": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "model": "Unknown", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "mpath": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:07.0", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "Unknown", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "scsi_addr": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "serial": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "Virtio Block Device", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "wwid": "Unknown" 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: { 2026-03-09T21:55:26.657 
INFO:tasks.workunit.client.0.vm11.stdout: "alt_dev_name": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "description": "Virtio Block Device Unknown (42.9GB)", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "dev_name": "vda", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "disk_size_bytes": 42949672960, 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "disk_type": "hdd", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_id": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "enclosure_slot": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "model": "Unknown", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "mpath": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:05.0", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "Unknown", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "scsi_addr": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "serial": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "Virtio Block Device", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "wwid": "Unknown" 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: } 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: ], 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "hostname": "vm11", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "interfaces": { 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "docker0": { 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "driver": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "iftype": "logical", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "ipv4_address": "172.17.0.1/16", 
2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "ipv6_address": "", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "lower_devs_list": [], 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "mtu": 1500, 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "nic_type": "bridge", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "operstate": "down", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "speed": -1, 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "upper_devs_list": [] 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "ens3": { 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "driver": "virtio_net", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "iftype": "physical", 2026-03-09T21:55:26.657 INFO:tasks.workunit.client.0.vm11.stdout: "ipv4_address": "192.168.123.111/24", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "ipv6_address": "fe80::5055:ff:fe00:b/64", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "lower_devs_list": [], 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "mtu": 1500, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "nic_type": "ethernet", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "operstate": "up", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "speed": -1, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "upper_devs_list": [] 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: } 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "kernel": "5.15.0-1092-kvm", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "kernel_parameters": { 2026-03-09T21:55:26.658 
INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_nonlocal_bind": "0" 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "kernel_security": { 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "description": "AppArmor: Enabled(40 enforce)", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "enforce": 40, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "type": "AppArmor" 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "memory_available_kb": 7831984, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "memory_free_kb": 1814728, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "memory_total_kb": 8156572, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "model": " (Standard PC (i440FX + PIIX, 1996))", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "nic_count": 1, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "operating_system": "Ubuntu 22.04.5 LTS (Jammy Jellyfish)", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "product_serial": "", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "selinux_enabled": false, 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "shortname": "vm11", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "subscribed": "Unknown", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "sysctl_options": { 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "abi.vsyscall32": "1", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "debug.exception-trace": "1", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.cdrom.autoclose": "1", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.cdrom.autoeject": 
"0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.cdrom.check_media": "0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.cdrom.debug": "0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.cdrom.info": "", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.cdrom.lock": "0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.scsi.logging_level": "0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "dev.tty.ldisc_autoload": "1", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.aio-max-nr": "1048576", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.aio-nr": "0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.dentry-state": "68386\t57935\t45\t0\t16116\t0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.epoll.max_user_watches": "1814839", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.fanotify.max_queued_events": "16384", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.fanotify.max_user_groups": "128", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.fanotify.max_user_marks": "66044", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.file-max": "9223372036854775807", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.file-nr": "1280\t0\t9223372036854775807", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.inode-nr": "116999\t75010", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.inode-state": "116999\t75010\t0\t0\t0\t0\t0", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.inotify.max_queued_events": "16384", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: "fs.inotify.max_user_instances": "128", 2026-03-09T21:55:26.658 INFO:tasks.workunit.client.0.vm11.stdout: 
"fs.inotify.max_user_watches": "62113", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.lease-break-time": "45", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.leases-enable": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.mount-max": "100000", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.mqueue.msg_default": "10", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.mqueue.msg_max": "10", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.mqueue.msgsize_default": "8192", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.mqueue.msgsize_max": "8192", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.mqueue.queues_max": "256", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.nr_open": "1048576", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.overflowgid": "65534", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.overflowuid": "65534", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.pipe-max-size": "1048576", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.pipe-user-pages-hard": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.pipe-user-pages-soft": "16384", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.protected_fifos": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.protected_hardlinks": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.protected_regular": "2", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.protected_symlinks": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.allocated_dquots": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.cache_hits": "0", 2026-03-09T21:55:26.659 
INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.drops": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.free_dquots": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.lookups": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.reads": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.syncs": "16", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.warnings": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.quota.writes": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.suid_dumpable": "2", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "fs.verity.require_signatures": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.apparmor_display_secid_mode": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.auto_msgmni": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.bootloader_type": "114", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.bootloader_version": "2", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.bpf_stats_enabled": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.cad_pid": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.cap_last_cap": "40", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.core_pattern": "/home/ubuntu/cephtest/archive/coredump/%t.%p.core", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.core_pipe_limit": "10", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.core_uses_pid": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.ctrl-alt-del": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: 
"kernel.dmesg_restrict": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.domainname": "(none)", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.ftrace_dump_on_oops": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.hostname": "vm11", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.hotplug": "", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.io_delay_type": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.io_uring_disabled": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.io_uring_group": "-1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.kexec_load_disabled": "0", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.keys.gc_delay": "300", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.keys.maxbytes": "20000", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.keys.maxkeys": "200", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.keys.root_maxbytes": "25000000", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.keys.root_maxkeys": "1000000", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.kptr_restrict": "1", 2026-03-09T21:55:26.659 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.max_lock_depth": "1024", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.max_rcu_stall_to_panic": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.modprobe": "/sbin/modprobe", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.modules_disabled": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.msg_next_id": "-1", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.msgmax": "8192", 
2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.msgmnb": "16384", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.msgmni": "32000", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.ngroups_max": "65536", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.ns_last_pid": "19554", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.numa_balancing": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.oops_all_cpu_backtrace": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.oops_limit": "10000", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.osrelease": "5.15.0-1092-kvm", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.ostype": "Linux", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.overflowgid": "65534", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.overflowuid": "65534", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.panic": "-1", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.panic_on_io_nmi": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.panic_on_oops": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.panic_on_rcu_stall": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.panic_on_unrecovered_nmi": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.panic_on_warn": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.panic_print": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.perf_cpu_time_max_percent": "25", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.perf_event_max_contexts_per_stack": "8", 2026-03-09T21:55:26.660 
INFO:tasks.workunit.client.0.vm11.stdout: "kernel.perf_event_max_sample_rate": "100000", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.perf_event_max_stack": "127", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.perf_event_mlock_kb": "516", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.perf_event_paranoid": "4", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.pid_max": "4194304", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.poweroff_cmd": "/sbin/poweroff", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.print-fatal-signals": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.printk": "4\t4\t1\t7", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.printk_delay": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.printk_devkmsg": "on", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.printk_ratelimit": "5", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.printk_ratelimit_burst": "10", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.pty.max": "4096", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.pty.nr": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.pty.reserve": "1024", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.random.boot_id": "e08a98d4-de1c-48bb-bf94-e32aa7804cde", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.random.entropy_avail": "256", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.random.poolsize": "256", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.random.urandom_min_reseed_secs": "60", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: 
"kernel.random.uuid": "78949af9-f1e0-405a-9839-1a54a5e33bc1", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.random.write_wakeup_threshold": "256", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.randomize_va_space": "2", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.real-root-dev": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_cfs_bandwidth_slice_us": "5000", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_child_runs_first": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_deadline_period_max_us": "4194304", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_deadline_period_min_us": "100", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_energy_aware": "1", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_rr_timeslice_ms": "100", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_rt_period_us": "1000000", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_rt_runtime_us": "950000", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_schedstats": "0", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_util_clamp_max": "1024", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_util_clamp_min": "1024", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sched_util_clamp_min_rt_default": "1024", 2026-03-09T21:55:26.660 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.seccomp.actions_avail": "kill_process kill_thread trap errno user_notif trace log allow", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.seccomp.actions_logged": "kill_process kill_thread trap errno user_notif trace log", 
2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sem": "32000\t1024000000\t500\t32000", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sem_next_id": "-1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.shm_next_id": "-1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.shm_rmid_forced": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.shmall": "18446744073692774399", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.shmmax": "18446744073692774399", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.shmmni": "4096", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.sysctl_writes_strict": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.tainted": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.task_delayacct": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.threads-max": "63692", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.timer_migration": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.traceoff_on_warning": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.tracepoint_printk": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.unknown_nmi_panic": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.unprivileged_bpf_disabled": "2", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.unprivileged_userns_apparmor_policy": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.unprivileged_userns_clone": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.usermodehelper.bset": "4294967295\t511", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: 
"kernel.usermodehelper.inheritable": "4294967295\t511", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.version": "#97-Ubuntu SMP Fri Jan 23 15:00:24 UTC 2026", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.warn_limit": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "kernel.yama.ptrace_scope": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.bpf_jit_enable": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.bpf_jit_harden": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.bpf_jit_kallsyms": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.bpf_jit_limit": "528482304", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.busy_poll": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.busy_read": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.default_qdisc": "pfifo_fast", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.dev_weight": "64", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.dev_weight_rx_bias": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.dev_weight_tx_bias": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.devconf_inherit_init_net": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.fb_tunnels_only_for_init_net": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.flow_limit_cpu_bitmap": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.flow_limit_table_len": "4096", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.gro_normal_batch": "8", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: 
"net.core.high_order_alloc_disable": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.max_skb_frags": "17", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.message_burst": "10", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.message_cost": "5", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.netdev_budget": "300", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.netdev_budget_usecs": "8000", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.netdev_max_backlog": "1000", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.netdev_rss_key": "00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.netdev_tstamp_prequeue": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.netdev_unregister_timeout_secs": "10", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.optmem_max": "20480", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.rmem_default": "212992", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.rmem_max": "212992", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.rps_sock_flow_entries": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.somaxconn": "4096", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.tstamp_allow_data": "1", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.warnings": "0", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.wmem_default": "212992", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.wmem_max": 
"212992", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.xfrm_acq_expires": "30", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.xfrm_aevent_etime": "10", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.xfrm_aevent_rseqth": "2", 2026-03-09T21:55:26.661 INFO:tasks.workunit.client.0.vm11.stdout: "net.core.xfrm_larval_drop": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.cipso_cache_bucket_size": "10", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.cipso_cache_enable": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.cipso_rbm_optfmt": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.cipso_rbm_strictvalid": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.accept_local": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.accept_redirects": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.accept_source_route": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.arp_accept": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.arp_announce": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.arp_filter": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.arp_ignore": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.arp_notify": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.bc_forwarding": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.bootp_relay": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.disable_policy": "0", 
2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.disable_xfrm": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.drop_gratuitous_arp": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.drop_unicast_in_l2_multicast": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.force_igmp_version": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.forwarding": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.igmpv2_unsolicited_report_interval": "10000", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.igmpv3_unsolicited_report_interval": "1000", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.ignore_routes_with_linkdown": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.log_martians": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.mc_forwarding": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.medium_id": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.promote_secondaries": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.proxy_arp": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.proxy_arp_pvlan": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.route_localnet": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.rp_filter": "2", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.secure_redirects": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.send_redirects": "1", 
2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.shared_media": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.src_valid_mark": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.all.tag": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.accept_local": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.accept_redirects": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.accept_source_route": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.arp_accept": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.arp_announce": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.arp_filter": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.arp_ignore": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.arp_notify": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.bc_forwarding": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.bootp_relay": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.disable_policy": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.disable_xfrm": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.drop_gratuitous_arp": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.drop_unicast_in_l2_multicast": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.force_igmp_version": "0", 
2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.forwarding": "1", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.igmpv2_unsolicited_report_interval": "10000", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.igmpv3_unsolicited_report_interval": "1000", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.ignore_routes_with_linkdown": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.log_martians": "0", 2026-03-09T21:55:26.662 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.mc_forwarding": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.medium_id": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.promote_secondaries": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.proxy_arp": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.proxy_arp_pvlan": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.route_localnet": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.rp_filter": "2", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.secure_redirects": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.send_redirects": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.shared_media": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.src_valid_mark": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.default.tag": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: 
"net.ipv4.conf.docker0.accept_local": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.accept_redirects": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.accept_source_route": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.arp_accept": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.arp_announce": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.arp_filter": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.arp_ignore": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.arp_notify": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.bc_forwarding": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.bootp_relay": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.disable_policy": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.disable_xfrm": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.drop_gratuitous_arp": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.drop_unicast_in_l2_multicast": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.force_igmp_version": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.forwarding": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.igmpv2_unsolicited_report_interval": "10000", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.igmpv3_unsolicited_report_interval": "1000", 
2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.ignore_routes_with_linkdown": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.log_martians": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.mc_forwarding": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.medium_id": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.promote_secondaries": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.proxy_arp": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.proxy_arp_pvlan": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.route_localnet": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.rp_filter": "2", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.secure_redirects": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.send_redirects": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.shared_media": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.src_valid_mark": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.docker0.tag": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.accept_local": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.accept_redirects": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.accept_source_route": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.arp_accept": "0", 
2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.arp_announce": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.arp_filter": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.arp_ignore": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.arp_notify": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.bc_forwarding": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.bootp_relay": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.disable_policy": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.disable_xfrm": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.drop_gratuitous_arp": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.drop_unicast_in_l2_multicast": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.force_igmp_version": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.forwarding": "1", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.igmpv2_unsolicited_report_interval": "10000", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.igmpv3_unsolicited_report_interval": "1000", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.ignore_routes_with_linkdown": "0", 2026-03-09T21:55:26.663 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.log_martians": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.mc_forwarding": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.medium_id": "0", 
2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.promote_secondaries": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.proxy_arp": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.proxy_arp_pvlan": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.route_localnet": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.rp_filter": "2", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.secure_redirects": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.send_redirects": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.shared_media": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.src_valid_mark": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.ens3.tag": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.accept_local": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.accept_redirects": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.accept_source_route": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.arp_accept": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.arp_announce": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.arp_filter": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.arp_ignore": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.arp_notify": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.bc_forwarding": 
"0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.bootp_relay": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.disable_policy": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.disable_xfrm": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.drop_gratuitous_arp": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.drop_unicast_in_l2_multicast": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.force_igmp_version": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.forwarding": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.igmpv2_unsolicited_report_interval": "10000", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.igmpv3_unsolicited_report_interval": "1000", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.ignore_routes_with_linkdown": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.log_martians": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.mc_forwarding": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.medium_id": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.promote_secondaries": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.proxy_arp": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.proxy_arp_pvlan": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.route_localnet": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.rp_filter": "2", 2026-03-09T21:55:26.664 
INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.secure_redirects": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.send_redirects": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.shared_media": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.src_valid_mark": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.conf.lo.tag": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.fib_notify_on_flag_change": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.fib_sync_mem": "524288", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.fwmark_reflect": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_echo_enable_probe": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_echo_ignore_all": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_echo_ignore_broadcasts": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_errors_use_inbound_ifaddr": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_ignore_bogus_error_responses": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_msgs_burst": "50", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_msgs_per_sec": "1000", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_ratelimit": "1000", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.icmp_ratemask": "6168", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.igmp_link_local_mcast_reports": "1", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.igmp_max_memberships": "20", 
2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.igmp_max_msf": "10", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.inet_peer_maxttl": "600", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.inet_peer_minttl": "120", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.inet_peer_threshold": "65664", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_autobind_reuse": "0", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_default_ttl": "64", 2026-03-09T21:55:26.664 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_dynaddr": "0", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_early_demux": "1", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_forward": "1", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_forward_update_priority": "1", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_forward_use_pmtu": "0", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_local_port_range": "32768\t60999", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_local_reserved_ports": "", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_no_pmtu_disc": "0", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_nonlocal_bind": "0", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ip_unprivileged_port_start": "1024", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ipfrag_high_thresh": "4194304", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ipfrag_low_thresh": "3145728", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ipfrag_max_dist": "64", 2026-03-09T21:55:26.665 
INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ipfrag_secret_interval": "0", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ipfrag_time": "30", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.anycast_delay": "100", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.app_solicit": "0", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.base_reachable_time_ms": "30000", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.delay_first_probe_time": "5", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.gc_interval": "30", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.gc_stale_time": "60", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.gc_thresh1": "128", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.gc_thresh2": "512", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.gc_thresh3": "1024", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.locktime": "100", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.mcast_resolicit": "0", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.mcast_solicit": "3", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.proxy_delay": "80", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.proxy_qlen": "64", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.retrans_time_ms": "1000", 2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.ucast_solicit": "3", 2026-03-09T21:55:26.665 
INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.unres_qlen": "101",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.default.unres_qlen_bytes": "212992",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.anycast_delay": "100",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.app_solicit": "0",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.base_reachable_time_ms": "30000",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.delay_first_probe_time": "5",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.gc_stale_time": "60",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.locktime": "100",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.mcast_resolicit": "0",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.mcast_solicit": "3",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.proxy_delay": "80",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.proxy_qlen": "64",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.retrans_time_ms": "1000",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.ucast_solicit": "3",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.unres_qlen": "101",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.docker0.unres_qlen_bytes": "212992",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.anycast_delay": "100",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.app_solicit": "0",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.base_reachable_time_ms": "30000",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.delay_first_probe_time": "5",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.gc_stale_time": "60",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.locktime": "100",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.mcast_resolicit": "0",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.mcast_solicit": "3",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.proxy_delay": "80",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.proxy_qlen": "64",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.retrans_time_ms": "1000",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.ucast_solicit": "3",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.unres_qlen": "101",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.ens3.unres_qlen_bytes": "212992",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.anycast_delay": "100",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.app_solicit": "0",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.base_reachable_time_ms": "30000",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.delay_first_probe_time": "5",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.gc_stale_time": "60",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.locktime": "100",
2026-03-09T21:55:26.665 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.mcast_resolicit": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.mcast_solicit": "3",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.proxy_delay": "80",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.proxy_qlen": "64",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.retrans_time_ms": "1000",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.ucast_solicit": "3",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.unres_qlen": "101",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.neigh.lo.unres_qlen_bytes": "212992",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.nexthop_compat_mode": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.ping_group_range": "0\t2147483647",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.raw_l3mdev_accept": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.error_burst": "1250",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.error_cost": "250",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.gc_elasticity": "8",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.gc_interval": "60",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.gc_min_interval": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.gc_min_interval_ms": "500",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.gc_thresh": "-1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.gc_timeout": "300",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.max_size": "2147483647",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.min_adv_mss": "256",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.min_pmtu": "552",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.mtu_expires": "600",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.redirect_load": "5",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.redirect_number": "9",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.route.redirect_silence": "5120",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_abort_on_overflow": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_adv_win_scale": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_allowed_congestion_control": "reno cubic",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_app_win": "31",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_autocorking": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_available_congestion_control": "reno cubic",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_available_ulp": "espintcp mptcp",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_base_mss": "1024",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_challenge_ack_limit": "1000",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_comp_sack_delay_ns": "1000000",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_comp_sack_nr": "44",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_comp_sack_slack_ns": "100000",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_congestion_control": "cubic",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_dsack": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_early_demux": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_early_retrans": "3",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_ecn": "2",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_ecn_fallback": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_fack": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_fastopen": "1",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_fastopen_blackhole_timeout_sec": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_fastopen_key": "aba4252e-efab1998-9e9ccbfc-fe810ade",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_fin_timeout": "60",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_frto": "2",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_fwmark_accept": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_invalid_ratelimit": "500",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_keepalive_intvl": "75",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_keepalive_probes": "9",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_keepalive_time": "7200",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_l3mdev_accept": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_limit_output_bytes": "1048576",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_low_latency": "0",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_max_orphans": "32768",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_max_reordering": "300",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_max_syn_backlog": "512",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_max_tw_buckets": "32768",
2026-03-09T21:55:26.666 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_mem": "95214\t126952\t190428",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_migrate_req": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_min_rtt_wlen": "300",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_min_snd_mss": "48",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_min_tso_segs": "2",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_moderate_rcvbuf": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_mtu_probe_floor": "48",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_mtu_probing": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_no_metrics_save": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_no_ssthresh_metrics_save": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_notsent_lowat": "4294967295",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_orphan_retries": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_pacing_ca_ratio": "120",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_pacing_ss_ratio": "200",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_probe_interval": "600",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_probe_threshold": "8",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_recovery": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_reflect_tos": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_reordering": "3",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_retrans_collapse": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_retries1": "3",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_retries2": "15",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_rfc1337": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_rmem": "4096\t131072\t6291456",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_rx_skb_cache": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_sack": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_slow_start_after_idle": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_stdurg": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_syn_retries": "6",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_synack_retries": "5",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_syncookies": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_thin_linear_timeouts": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_timestamps": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_tso_win_divisor": "3",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_tw_reuse": "2",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_tx_skb_cache": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_window_scaling": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_wmem": "4096\t16384\t4194304",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.tcp_workaround_signed_windows": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.udp_early_demux": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.udp_l3mdev_accept": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.udp_mem": "190428\t253904\t380856",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.udp_rmem_min": "4096",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.udp_wmem_min": "4096",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv4.xfrm4_gc_thresh": "32768",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.anycast_src_echo_reply": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.auto_flowlabels": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.bindv6only": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.calipso_cache_bucket_size": "10",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.calipso_cache_enable": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_dad": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_ra": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_ra_defrtr": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_ra_from_local": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_ra_min_hop_limit": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_ra_min_lft": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_ra_mtu": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_ra_pinfo": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_redirects": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.accept_source_route": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.addr_gen_mode": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.autoconf": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.dad_transmits": "1",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.disable_ipv6": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.disable_policy": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.drop_unicast_in_l2_multicast": "0",
2026-03-09T21:55:26.667 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.drop_unsolicited_na": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.enhanced_dad": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.force_mld_version": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.force_tllao": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.forwarding": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.hop_limit": "64",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.ignore_routes_with_linkdown": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.ioam6_enabled": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.ioam6_id": "65535",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.ioam6_id_wide": "4294967295",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.keep_addr_on_down": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.max_addresses": "16",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.max_desync_factor": "600",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.mldv1_unsolicited_report_interval": "10000",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.mldv2_unsolicited_report_interval": "1000",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.mtu": "1280",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.ndisc_notify": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.ndisc_tclass": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.proxy_ndp": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.ra_defrtr_metric": "1024",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.regen_max_retry": "3",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.router_solicitation_delay": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.router_solicitation_interval": "4",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.router_solicitation_max_interval": "3600",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.router_solicitations": "-1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.rpl_seg_enabled": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.seg6_enabled": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.suppress_frag_ndisc": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.temp_prefered_lft": "86400",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.temp_valid_lft": "604800",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.use_oif_addrs_only": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.all.use_tempaddr": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_dad": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_ra": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_ra_defrtr": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_ra_from_local": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_ra_min_hop_limit": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_ra_min_lft": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_ra_mtu": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_ra_pinfo": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_redirects": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.accept_source_route": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.addr_gen_mode": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.autoconf": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.dad_transmits": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.disable_ipv6": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.disable_policy": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.drop_unicast_in_l2_multicast": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.drop_unsolicited_na": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.enhanced_dad": "1",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.force_mld_version": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.force_tllao": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.forwarding": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.hop_limit": "64",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.ignore_routes_with_linkdown": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.ioam6_enabled": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.ioam6_id": "65535",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.ioam6_id_wide": "4294967295",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.keep_addr_on_down": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.max_addresses": "16",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.max_desync_factor": "600",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.mldv1_unsolicited_report_interval": "10000",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.mldv2_unsolicited_report_interval": "1000",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.mtu": "1280",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.ndisc_notify": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.ndisc_tclass": "0",
2026-03-09T21:55:26.668 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.proxy_ndp": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.ra_defrtr_metric": "1024",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.regen_max_retry": "3",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.router_solicitation_delay": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.router_solicitation_interval": "4",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.router_solicitation_max_interval": "3600",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.router_solicitations": "-1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.rpl_seg_enabled": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.seg6_enabled": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.suppress_frag_ndisc": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.temp_prefered_lft": "86400",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.temp_valid_lft": "604800",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.use_oif_addrs_only": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.default.use_tempaddr": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_dad": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_ra": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_ra_defrtr": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_ra_from_local": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_ra_min_hop_limit": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_ra_min_lft": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_ra_mtu": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_ra_pinfo": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_redirects": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.accept_source_route": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.addr_gen_mode": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.autoconf": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.dad_transmits": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.disable_ipv6": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.disable_policy": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.drop_unicast_in_l2_multicast": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.drop_unsolicited_na": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.enhanced_dad": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.force_mld_version": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.force_tllao": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.forwarding": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.hop_limit": "64",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.ignore_routes_with_linkdown": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.ioam6_enabled": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.ioam6_id": "65535",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.ioam6_id_wide": "4294967295",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.keep_addr_on_down": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.max_addresses": "16",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.max_desync_factor": "600",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.mldv1_unsolicited_report_interval": "10000",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.mldv2_unsolicited_report_interval": "1000",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.mtu": "1500",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.ndisc_notify": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.ndisc_tclass": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.proxy_ndp": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.ra_defrtr_metric": "1024",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.regen_max_retry": "3",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.router_solicitation_delay": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.router_solicitation_interval": "4",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.router_solicitation_max_interval": "3600",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.router_solicitations": "-1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.rpl_seg_enabled": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.seg6_enabled": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.suppress_frag_ndisc": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.temp_prefered_lft": "86400",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.temp_valid_lft": "604800",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.use_oif_addrs_only": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.docker0.use_tempaddr": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_dad": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_ra": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_ra_defrtr": "1",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_ra_from_local": "0",
2026-03-09T21:55:26.669 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_ra_min_hop_limit": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_ra_min_lft": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_ra_mtu": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_ra_pinfo": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_redirects": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.accept_source_route": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.addr_gen_mode": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.autoconf": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.dad_transmits": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.disable_ipv6": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.disable_policy": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.drop_unicast_in_l2_multicast": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.drop_unsolicited_na": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.enhanced_dad": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.force_mld_version": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.force_tllao": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.forwarding": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.hop_limit": "64",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.ignore_routes_with_linkdown": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.ioam6_enabled": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.ioam6_id": "65535",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.ioam6_id_wide": "4294967295",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.keep_addr_on_down": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.max_addresses": "16",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.max_desync_factor": "600",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.mldv1_unsolicited_report_interval": "10000",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.mldv2_unsolicited_report_interval": "1000",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.mtu": "1500",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.ndisc_notify": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.ndisc_tclass": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.proxy_ndp": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.ra_defrtr_metric": "1024",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.regen_max_retry": "3",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.router_solicitation_delay": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.router_solicitation_interval": "4",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.router_solicitation_max_interval": "3600",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.router_solicitations": "-1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.rpl_seg_enabled": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.seg6_enabled": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.suppress_frag_ndisc": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.temp_prefered_lft": "86400",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.temp_valid_lft": "604800",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.use_oif_addrs_only": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.ens3.use_tempaddr": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_dad": "-1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_ra": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_ra_defrtr": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_ra_from_local": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_ra_min_hop_limit": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_ra_min_lft": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_ra_mtu": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_ra_pinfo": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_redirects": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.accept_source_route": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.addr_gen_mode": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.autoconf": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.dad_transmits": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.disable_ipv6": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.disable_policy": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.drop_unicast_in_l2_multicast": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.drop_unsolicited_na": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.enhanced_dad": "1",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.force_mld_version": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.force_tllao": "0",
2026-03-09T21:55:26.670 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.forwarding": "0",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.hop_limit": "64",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.ignore_routes_with_linkdown": "0",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.ioam6_enabled": "0",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.ioam6_id": "65535",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.ioam6_id_wide": "4294967295",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.keep_addr_on_down": "0",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.max_addresses": "16",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.max_desync_factor": "600",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.mldv1_unsolicited_report_interval": "10000",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.mldv2_unsolicited_report_interval": "1000",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.mtu": "65536",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.ndisc_notify": "0",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.ndisc_tclass": "0",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.proxy_ndp": "0",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.ra_defrtr_metric": "1024",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.regen_max_retry": "3",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.router_solicitation_delay": "1",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.router_solicitation_interval": "4",
2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.router_solicitation_max_interval": "3600",
2026-03-09T21:55:26.671
INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.router_solicitations": "-1", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.rpl_seg_enabled": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.seg6_enabled": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.suppress_frag_ndisc": "1", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.temp_prefered_lft": "86400", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.temp_valid_lft": "604800", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.use_oif_addrs_only": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.conf.lo.use_tempaddr": "-1", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.fib_multipath_hash_fields": "7", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.fib_multipath_hash_policy": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.fib_notify_on_flag_change": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.flowlabel_consistency": "1", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.flowlabel_reflect": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.flowlabel_state_ranges": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.fwmark_reflect": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.icmp.echo_ignore_all": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.icmp.echo_ignore_anycast": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.icmp.echo_ignore_multicast": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: 
"net.ipv6.icmp.ratelimit": "1000", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.icmp.ratemask": "0-1,3-127", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.idgen_delay": "1", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.idgen_retries": "3", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.ioam6_id": "16777215", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.ioam6_id_wide": "72057594037927935", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.ip6frag_high_thresh": "4194304", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.ip6frag_low_thresh": "3145728", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.ip6frag_secret_interval": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.ip6frag_time": "60", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.ip_nonlocal_bind": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.max_dst_opts_length": "2147483647", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.max_dst_opts_number": "8", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.max_hbh_length": "2147483647", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.max_hbh_opts_number": "8", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.mld_max_msf": "64", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.mld_qrv": "2", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.anycast_delay": "100", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.app_solicit": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: 
"net.ipv6.neigh.default.base_reachable_time_ms": "30000", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.delay_first_probe_time": "5", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.gc_interval": "30", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.gc_stale_time": "60", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.gc_thresh1": "128", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.gc_thresh2": "512", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.gc_thresh3": "1024", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.locktime": "0", 2026-03-09T21:55:26.671 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.mcast_resolicit": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.mcast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.proxy_delay": "80", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.proxy_qlen": "64", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.retrans_time_ms": "1000", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.ucast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.unres_qlen": "101", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.default.unres_qlen_bytes": "212992", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.anycast_delay": "100", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.app_solicit": "0", 2026-03-09T21:55:26.672 
INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.base_reachable_time_ms": "30000", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.delay_first_probe_time": "5", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.gc_stale_time": "60", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.locktime": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.mcast_resolicit": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.mcast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.proxy_delay": "80", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.proxy_qlen": "64", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.retrans_time_ms": "1000", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.ucast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.unres_qlen": "101", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.docker0.unres_qlen_bytes": "212992", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.anycast_delay": "100", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.app_solicit": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.base_reachable_time_ms": "30000", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.delay_first_probe_time": "5", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.gc_stale_time": "60", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.locktime": 
"0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.mcast_resolicit": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.mcast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.proxy_delay": "80", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.proxy_qlen": "64", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.retrans_time_ms": "1000", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.ucast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.unres_qlen": "101", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.ens3.unres_qlen_bytes": "212992", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.anycast_delay": "100", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.app_solicit": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.base_reachable_time_ms": "30000", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.delay_first_probe_time": "5", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.gc_stale_time": "60", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.locktime": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.mcast_resolicit": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.mcast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.proxy_delay": "80", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.proxy_qlen": "64", 2026-03-09T21:55:26.672 
INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.retrans_time_ms": "1000", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.ucast_solicit": "3", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.unres_qlen": "101", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.neigh.lo.unres_qlen_bytes": "212992", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.gc_elasticity": "9", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.gc_interval": "30", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.gc_min_interval": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.gc_min_interval_ms": "500", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.gc_thresh": "1024", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.gc_timeout": "60", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.max_size": "2147483647", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.min_adv_mss": "1220", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.mtu_expires": "600", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.route.skip_notify_on_dev_down": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.seg6_flowlabel": "0", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.ipv6.xfrm6_gc_thresh": "32768", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.mptcp.add_addr_timeout": "120", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.mptcp.allow_join_initial_addr_port": "1", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.mptcp.checksum_enabled": "0", 
2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.mptcp.enabled": "1", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.mptcp.stale_loss_cnt": "4", 2026-03-09T21:55:26.672 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_acct": "0", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_buckets": "262144", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_checksum": "1", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_count": "86", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_loose": "1", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_timeout_closereq": "64", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_timeout_closing": "64", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_timeout_open": "43200", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_timeout_partopen": "480", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_timeout_request": "240", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_timeout_respond": "480", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_dccp_timeout_timewait": "240", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_events": "1", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_expect_max": "4096", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_frag6_high_thresh": "4194304", 2026-03-09T21:55:26.673 
INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_frag6_low_thresh": "3145728", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_frag6_timeout": "60", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_generic_timeout": "600", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_gre_timeout": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_gre_timeout_stream": "180", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_helper": "0", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_icmp_timeout": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_icmpv6_timeout": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_log_invalid": "0", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_max": "262144", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_sctp_timeout_closed": "10", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_sctp_timeout_cookie_echoed": "3", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_sctp_timeout_cookie_wait": "3", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_sctp_timeout_established": "210", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_sctp_timeout_heartbeat_sent": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_sctp_timeout_shutdown_ack_sent": "3", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: 
"net.netfilter.nf_conntrack_sctp_timeout_shutdown_recd": "3", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_sctp_timeout_shutdown_sent": "3", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_be_liberal": "0", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_ignore_invalid_rst": "0", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_loose": "1", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_max_retrans": "3", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_close": "10", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_close_wait": "60", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_established": "432000", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_fin_wait": "120", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_last_ack": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_max_retrans": "300", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_syn_recv": "60", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_syn_sent": "120", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_time_wait": "120", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_tcp_timeout_unacknowledged": "300", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: 
"net.netfilter.nf_conntrack_timestamp": "0", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_udp_timeout": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_conntrack_udp_timeout_stream": "120", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_flowtable_tcp_timeout": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_flowtable_udp_timeout": "30", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_hooks_lwtunnel": "0", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.0": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.1": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.10": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.11": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.12": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.2": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.3": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.4": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.5": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.6": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.7": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.8": "NONE", 2026-03-09T21:55:26.673 INFO:tasks.workunit.client.0.vm11.stdout: "net.netfilter.nf_log.9": "NONE", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: 
"net.netfilter.nf_log_all_netns": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "net.nf_conntrack_max": "262144", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "net.unix.max_dgram_qlen": "512", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "sunrpc.max_resvport": "1023", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "sunrpc.min_resvport": "665", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "sunrpc.tcp_fin_timeout": "15", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "sunrpc.tcp_max_slot_table_entries": "65536", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "sunrpc.tcp_slot_table_entries": "2", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "sunrpc.udp_slot_table_entries": "16", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_cgroup_namespaces": "31846", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_fanotify_groups": "128", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_fanotify_marks": "66044", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_inotify_instances": "128", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_inotify_watches": "62113", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_ipc_namespaces": "31846", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_mnt_namespaces": "31846", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_net_namespaces": "31846", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_pid_namespaces": "31846", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_time_namespaces": "31846", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_user_namespaces": "31846", 
2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "user.max_uts_namespaces": "31846", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.admin_reserve_kbytes": "8192", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.dirty_background_bytes": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.dirty_background_ratio": "10", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.dirty_bytes": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.dirty_expire_centisecs": "3000", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.dirty_ratio": "20", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.dirty_writeback_centisecs": "500", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.dirtytime_expire_seconds": "43200", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.hugetlb_shm_group": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.laptop_mode": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.legacy_va_layout": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.lowmem_reserve_ratio": "256\t32\t0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.max_map_count": "65530", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.min_free_kbytes": "11421", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.min_slab_ratio": "5", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.min_unmapped_ratio": "1", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.mmap_min_addr": "65536", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.mmap_rnd_bits": "28", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.mmap_rnd_compat_bits": "8", 2026-03-09T21:55:26.674 
INFO:tasks.workunit.client.0.vm11.stdout: "vm.nr_hugepages": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.nr_hugepages_mempolicy": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.nr_overcommit_hugepages": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.numa_stat": "1", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.numa_zonelist_order": "Node", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.oom_dump_tasks": "1", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.oom_kill_allocating_task": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.overcommit_kbytes": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.overcommit_memory": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.overcommit_ratio": "50", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.page-cluster": "3", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.page_lock_unfairness": "5", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.panic_on_oom": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.percpu_pagelist_high_fraction": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.stat_interval": "1", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.swappiness": "60", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.unprivileged_userfaultfd": "0", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.user_reserve_kbytes": "131072", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.vfs_cache_pressure": "100", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.watermark_boost_factor": "15000", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.watermark_scale_factor": 
"10", 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "vm.zone_reclaim_mode": "0" 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: }, 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "system_uptime": 309.24, 2026-03-09T21:55:26.674 INFO:tasks.workunit.client.0.vm11.stdout: "tcp6_ports_used": [ 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 41723, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 22, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 111 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: ], 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: "tcp_ports_used": [ 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 22, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 111, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 34705, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 58759, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 5345, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 53 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: ], 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: "timestamp": 1773093326.6551673, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: "udp6_ports_used": [ 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 111, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 123, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 123, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 123, 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 37526 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: ], 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: "udp_ports_used": [ 2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 53, 
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 68,
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 111,
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 123,
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 123,
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 123,
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 881,
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: 41907
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "QEMU"
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stdout:}
2026-03-09T21:55:26.675 INFO:tasks.workunit.client.0.vm11.stderr:+ /usr/sbin/cephadm version
2026-03-09T21:55:26.755 INFO:tasks.workunit.client.0.vm11.stdout:cephadm version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)
2026-03-09T21:55:26.770 INFO:tasks.workunit.client.0.vm11.stderr:+ /usr/sbin/cephadm version
2026-03-09T21:55:26.770 INFO:tasks.workunit.client.0.vm11.stderr:+ grep 'cephadm version'
2026-03-09T21:55:26.866 INFO:tasks.workunit.client.0.vm11.stdout:cephadm version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)
2026-03-09T21:55:26.866 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' -z '' ']'
2026-03-09T21:55:26.866 INFO:tasks.workunit.client.0.vm11.stderr:+ /usr/sbin/cephadm version
2026-03-09T21:55:26.867 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -v UNSET
2026-03-09T21:55:26.973 INFO:tasks.workunit.client.0.vm11.stdout:cephadm version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)
2026-03-09T21:55:26.973 INFO:tasks.workunit.client.0.vm11.stderr:+ /usr/sbin/cephadm version
2026-03-09T21:55:26.973 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -v UNKNOWN
2026-03-09T21:55:27.079 INFO:tasks.workunit.client.0.vm11.stdout:cephadm version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)
2026-03-09T21:55:27.079 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef -- ceph -v
2026-03-09T21:55:27.079 INFO:tasks.workunit.client.0.vm11.stderr:+ grep 'ceph version'
2026-03-09T21:55:29.224 INFO:tasks.workunit.client.0.vm11.stderr:Unable to find image 'quay.ceph.io/ceph-ci/ceph:squid' locally
2026-03-09T21:55:30.550 INFO:tasks.workunit.client.0.vm11.stderr:squid: Pulling from ceph-ci/ceph
2026-03-09T21:55:30.550 INFO:tasks.workunit.client.0.vm11.stderr:8e380faede39: Pulling fs layer
2026-03-09T21:55:30.550 INFO:tasks.workunit.client.0.vm11.stderr:1752b8d01aa0: Pulling fs layer
2026-03-09T21:55:36.585 INFO:tasks.workunit.client.0.vm11.stderr:8e380faede39: Verifying Checksum
2026-03-09T21:55:36.585 INFO:tasks.workunit.client.0.vm11.stderr:8e380faede39: Download complete
2026-03-09T21:55:38.369 INFO:tasks.workunit.client.0.vm11.stderr:8e380faede39: Pull complete
2026-03-09T21:55:53.752 INFO:tasks.workunit.client.0.vm11.stderr:1752b8d01aa0: Verifying Checksum
2026-03-09T21:55:53.752 INFO:tasks.workunit.client.0.vm11.stderr:1752b8d01aa0: Download complete
2026-03-09T21:56:02.255 INFO:tasks.workunit.client.0.vm11.stderr:1752b8d01aa0: Pull complete
2026-03-09T21:56:02.258 INFO:tasks.workunit.client.0.vm11.stderr:Digest: sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T21:56:02.259 INFO:tasks.workunit.client.0.vm11.stderr:Status: Downloaded newer image for quay.ceph.io/ceph-ci/ceph:squid
2026-03-09T21:56:03.423 INFO:tasks.workunit.client.0.vm11.stdout:ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)
2026-03-09T21:56:03.423 INFO:tasks.workunit.client.0.vm11.stderr:+ grep FOO=BAR
2026-03-09T21:56:03.424 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef -e FOO=BAR -- printenv
2026-03-09T21:56:05.795 INFO:tasks.workunit.client.0.vm11.stdout:FOO=BAR
2026-03-09T21:56:05.795 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell -- cat
2026-03-09T21:56:05.795 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q foo
2026-03-09T21:56:05.799 INFO:tasks.workunit.client.0.vm11.stderr:+ echo foo
2026-03-09T21:56:07.064 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --force
2026-03-09T21:56:07.153 INFO:tasks.workunit.client.0.vm11.stdout:Deleting cluster with fsid: 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:56:08.395 INFO:tasks.workunit.client.0.vm11.stderr:++ mktemp -p tmp.test_cephadm.sh.Qa6KUG
2026-03-09T21:56:08.396 INFO:tasks.workunit.client.0.vm11.stderr:+ ORIG_CONFIG=tmp.test_cephadm.sh.Qa6KUG/tmp.uLgnMiIMIl
2026-03-09T21:56:08.397 INFO:tasks.workunit.client.0.vm11.stderr:++ mktemp -p tmp.test_cephadm.sh.Qa6KUG
2026-03-09T21:56:08.397 INFO:tasks.workunit.client.0.vm11.stderr:+ CONFIG=tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5
2026-03-09T21:56:08.397 INFO:tasks.workunit.client.0.vm11.stderr:++ mktemp -p tmp.test_cephadm.sh.Qa6KUG
2026-03-09T21:56:08.398 INFO:tasks.workunit.client.0.vm11.stderr:+ MONCONFIG=tmp.test_cephadm.sh.Qa6KUG/tmp.wkX7WVXszC
2026-03-09T21:56:08.398 INFO:tasks.workunit.client.0.vm11.stderr:++ mktemp -p tmp.test_cephadm.sh.Qa6KUG
2026-03-09T21:56:08.399 INFO:tasks.workunit.client.0.vm11.stderr:+ KEYRING=tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj
2026-03-09T21:56:08.399 INFO:tasks.workunit.client.0.vm11.stderr:+ IP=127.0.0.1
2026-03-09T21:56:08.399 INFO:tasks.workunit.client.0.vm11.stderr:+ cat
2026-03-09T21:56:08.400 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid bootstrap --mon-id a --mgr-id x --mon-ip 127.0.0.1 --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.uLgnMiIMIl --output-config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --output-keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj --output-pub-ssh-key tmp.test_cephadm.sh.Qa6KUG/ceph.pub --allow-overwrite --skip-mon-network --skip-monitoring-stack
2026-03-09T21:56:08.485 INFO:tasks.workunit.client.0.vm11.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-09T21:56:08.485 INFO:tasks.workunit.client.0.vm11.stdout:Verifying podman|docker is present...
2026-03-09T21:56:08.485 INFO:tasks.workunit.client.0.vm11.stdout:Verifying lvm2 is present...
2026-03-09T21:56:08.485 INFO:tasks.workunit.client.0.vm11.stdout:Verifying time synchronization is in place...
2026-03-09T21:56:08.511 INFO:tasks.workunit.client.0.vm11.stdout:Unit ntp.service is enabled and running
2026-03-09T21:56:08.511 INFO:tasks.workunit.client.0.vm11.stdout:Repeating the final host check...
2026-03-09T21:56:08.511 INFO:tasks.workunit.client.0.vm11.stdout:docker (/usr/bin/docker) is present
2026-03-09T21:56:08.511 INFO:tasks.workunit.client.0.vm11.stdout:systemctl is present
2026-03-09T21:56:08.511 INFO:tasks.workunit.client.0.vm11.stdout:lvcreate is present
2026-03-09T21:56:08.535 INFO:tasks.workunit.client.0.vm11.stdout:Unit ntp.service is enabled and running
2026-03-09T21:56:08.535 INFO:tasks.workunit.client.0.vm11.stdout:Host looks OK
2026-03-09T21:56:08.535 INFO:tasks.workunit.client.0.vm11.stdout:Cluster fsid: 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:56:08.535 INFO:tasks.workunit.client.0.vm11.stdout:Verifying IP 127.0.0.1 port 3300 ...
2026-03-09T21:56:08.535 INFO:tasks.workunit.client.0.vm11.stdout:Verifying IP 127.0.0.1 port 6789 ...
2026-03-09T21:56:08.535 INFO:tasks.workunit.client.0.vm11.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-09T21:56:08.535 INFO:tasks.workunit.client.0.vm11.stdout:Pulling container image quay.ceph.io/ceph-ci/ceph:squid...
2026-03-09T21:56:09.657 INFO:tasks.workunit.client.0.vm11.stdout:Ceph version: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)
2026-03-09T21:56:09.657 INFO:tasks.workunit.client.0.vm11.stdout:Extracting ceph user uid/gid from container image...
2026-03-09T21:56:09.762 INFO:tasks.workunit.client.0.vm11.stdout:Creating initial keys...
2026-03-09T21:56:10.279 INFO:tasks.workunit.client.0.vm11.stdout:Creating initial monmap...
2026-03-09T21:56:10.405 INFO:tasks.workunit.client.0.vm11.stdout:Creating mon...
2026-03-09T21:56:11.461 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for mon to start...
2026-03-09T21:56:11.461 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for mon...
2026-03-09T21:56:11.906 INFO:tasks.workunit.client.0.vm11.stdout:mon is available
2026-03-09T21:56:11.906 INFO:tasks.workunit.client.0.vm11.stdout:Assimilating anything we can from ceph.conf...
2026-03-09T21:56:12.112 INFO:tasks.workunit.client.0.vm11.stdout:Generating new minimal ceph.conf...
2026-03-09T21:56:12.310 INFO:tasks.workunit.client.0.vm11.stdout:Restarting the monitor...
2026-03-09T21:56:12.614 INFO:tasks.workunit.client.0.vm11.stdout:Wrote config to tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5
2026-03-09T21:56:12.616 INFO:tasks.workunit.client.0.vm11.stdout:Wrote keyring to tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj
2026-03-09T21:56:12.616 INFO:tasks.workunit.client.0.vm11.stdout:Creating mgr...
2026-03-09T21:56:12.616 INFO:tasks.workunit.client.0.vm11.stdout:Verifying port 0.0.0.0:9283 ...
2026-03-09T21:56:12.616 INFO:tasks.workunit.client.0.vm11.stdout:Verifying port 0.0.0.0:8765 ...
2026-03-09T21:56:13.005 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for mgr to start...
2026-03-09T21:56:13.005 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for mgr...
2026-03-09T21:56:13.347 INFO:tasks.workunit.client.0.vm11.stdout:mgr not available, waiting (1/15)...
2026-03-09T21:56:15.865 INFO:tasks.workunit.client.0.vm11.stdout:mgr not available, waiting (2/15)...
2026-03-09T21:56:18.168 INFO:tasks.workunit.client.0.vm11.stdout:mgr is available
2026-03-09T21:56:18.403 INFO:tasks.workunit.client.0.vm11.stdout:Enabling cephadm module...
2026-03-09T21:56:19.891 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for the mgr to restart...
2026-03-09T21:56:19.891 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for mgr epoch 5...
2026-03-09T21:56:23.750 INFO:tasks.workunit.client.0.vm11.stdout:mgr epoch 5 is available
2026-03-09T21:56:23.750 INFO:tasks.workunit.client.0.vm11.stdout:Setting orchestrator backend to cephadm...
2026-03-09T21:56:24.288 INFO:tasks.workunit.client.0.vm11.stdout:Generating ssh key...
2026-03-09T21:56:24.832 INFO:tasks.workunit.client.0.vm11.stdout:Wrote public SSH key to tmp.test_cephadm.sh.Qa6KUG/ceph.pub
2026-03-09T21:56:24.832 INFO:tasks.workunit.client.0.vm11.stdout:Adding key to root@localhost authorized_keys...
2026-03-09T21:56:24.832 INFO:tasks.workunit.client.0.vm11.stdout:Adding host vm11...
2026-03-09T21:56:26.832 INFO:tasks.workunit.client.0.vm11.stdout:Deploying mon service with default placement...
2026-03-09T21:56:27.191 INFO:tasks.workunit.client.0.vm11.stdout:Deploying mgr service with default placement...
2026-03-09T21:56:27.453 INFO:tasks.workunit.client.0.vm11.stdout:Deploying crash service with default placement...
2026-03-09T21:56:28.512 INFO:tasks.workunit.client.0.vm11.stdout:Enabling the dashboard module...
2026-03-09T21:56:30.173 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for the mgr to restart...
2026-03-09T21:56:30.174 INFO:tasks.workunit.client.0.vm11.stdout:Waiting for mgr epoch 9...
2026-03-09T21:56:34.063 INFO:tasks.workunit.client.0.vm11.stdout:mgr epoch 9 is available
2026-03-09T21:56:34.063 INFO:tasks.workunit.client.0.vm11.stdout:Generating a dashboard self-signed certificate...
2026-03-09T21:56:34.433 INFO:tasks.workunit.client.0.vm11.stdout:Creating initial admin user...
2026-03-09T21:56:34.886 INFO:tasks.workunit.client.0.vm11.stdout:Fetching dashboard port number...
2026-03-09T21:56:35.166 INFO:tasks.workunit.client.0.vm11.stdout:Ceph Dashboard is now available at:
2026-03-09T21:56:35.166 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.166 INFO:tasks.workunit.client.0.vm11.stdout: URL: https://vm11.local:8443/
2026-03-09T21:56:35.166 INFO:tasks.workunit.client.0.vm11.stdout: User: admin
2026-03-09T21:56:35.166 INFO:tasks.workunit.client.0.vm11.stdout: Password: m4pypp31y9
2026-03-09T21:56:35.166 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.166 INFO:tasks.workunit.client.0.vm11.stdout:Saving cluster configuration to /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/config directory
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout: sudo /usr/sbin/cephadm shell --fsid 00000000-0000-0000-0000-0000deadbeef -c tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 -k tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:Or, if you are only running a single cluster on this host:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout: sudo /usr/sbin/cephadm shell
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:Please consider enabling telemetry to help improve Ceph:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout: ceph telemetry on
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:For more information see:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:
2026-03-09T21:56:35.475 INFO:tasks.workunit.client.0.vm11.stdout:Bootstrap complete.
2026-03-09T21:56:35.496 INFO:tasks.workunit.client.0.vm11.stderr:+ test -e tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5
2026-03-09T21:56:35.496 INFO:tasks.workunit.client.0.vm11.stderr:+ test -e tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj
2026-03-09T21:56:35.496 INFO:tasks.workunit.client.0.vm11.stderr:+ rm -f tmp.test_cephadm.sh.Qa6KUG/tmp.uLgnMiIMIl
2026-03-09T21:56:35.497 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo test -e /var/log/ceph/00000000-0000-0000-0000-0000deadbeef/ceph-mon.a.log
2026-03-09T21:56:35.504 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo test -e /var/log/ceph/00000000-0000-0000-0000-0000deadbeef/ceph-mgr.x.log
2026-03-09T21:56:35.509 INFO:tasks.workunit.client.0.vm11.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x
2026-03-09T21:56:35.509 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-enabled ceph.target
2026-03-09T21:56:35.511 INFO:tasks.workunit.client.0.vm11.stdout:enabled
2026-03-09T21:56:35.511 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-active ceph.target
2026-03-09T21:56:35.513 INFO:tasks.workunit.client.0.vm11.stdout:active
2026-03-09T21:56:35.513 INFO:tasks.workunit.client.0.vm11.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x
2026-03-09T21:56:35.513 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef.target
2026-03-09T21:56:35.514 INFO:tasks.workunit.client.0.vm11.stdout:enabled
2026-03-09T21:56:35.514 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef.target
2026-03-09T21:56:35.516 INFO:tasks.workunit.client.0.vm11.stdout:active
2026-03-09T21:56:35.516 INFO:tasks.workunit.client.0.vm11.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x
2026-03-09T21:56:35.516 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mon.a
2026-03-09T21:56:35.518 INFO:tasks.workunit.client.0.vm11.stdout:enabled
2026-03-09T21:56:35.518 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mon.a
2026-03-09T21:56:35.519 INFO:tasks.workunit.client.0.vm11.stdout:active
2026-03-09T21:56:35.519 INFO:tasks.workunit.client.0.vm11.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x
2026-03-09T21:56:35.519 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mgr.x
2026-03-09T21:56:35.521 INFO:tasks.workunit.client.0.vm11.stdout:enabled
2026-03-09T21:56:35.521 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mgr.x
2026-03-09T21:56:35.522 INFO:tasks.workunit.client.0.vm11.stdout:active
2026-03-09T21:56:35.523 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl
2026-03-09T21:56:35.523 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q .slice
2026-03-09T21:56:35.524 INFO:tasks.workunit.client.0.vm11.stderr:+ grep system-ceph
2026-03-09T21:56:35.526 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph -s
2026-03-09T21:56:35.526 INFO:tasks.workunit.client.0.vm11.stderr:+ grep 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:56:35.994 INFO:tasks.workunit.client.0.vm11.stdout: id: 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:56:35.994 INFO:tasks.workunit.client.0.vm11.stderr:+ for t in mon mgr node-exporter prometheus grafana
2026-03-09T21:56:35.994 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph orch apply mon --unmanaged
2026-03-09T21:56:36.585 INFO:tasks.workunit.client.0.vm11.stdout:Scheduled mon update...
2026-03-09T21:56:36.646 INFO:tasks.workunit.client.0.vm11.stderr:+ for t in mon mgr node-exporter prometheus grafana
2026-03-09T21:56:36.646 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph orch apply mgr --unmanaged
2026-03-09T21:56:36.966 INFO:tasks.workunit.client.0.vm11.stdout:Scheduled mgr update...
2026-03-09T21:56:37.025 INFO:tasks.workunit.client.0.vm11.stderr:+ for t in mon mgr node-exporter prometheus grafana
2026-03-09T21:56:37.025 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph orch apply node-exporter --unmanaged
2026-03-09T21:56:37.397 INFO:tasks.workunit.client.0.vm11.stdout:Scheduled node-exporter update...
2026-03-09T21:56:37.461 INFO:tasks.workunit.client.0.vm11.stderr:+ for t in mon mgr node-exporter prometheus grafana
2026-03-09T21:56:37.461 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph orch apply prometheus --unmanaged
2026-03-09T21:56:37.866 INFO:tasks.workunit.client.0.vm11.stdout:Scheduled prometheus update...
2026-03-09T21:56:37.940 INFO:tasks.workunit.client.0.vm11.stderr:+ for t in mon mgr node-exporter prometheus grafana
2026-03-09T21:56:37.940 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph orch apply grafana --unmanaged
2026-03-09T21:56:38.409 INFO:tasks.workunit.client.0.vm11.stdout:Scheduled grafana update...
2026-03-09T21:56:38.487 INFO:tasks.workunit.client.0.vm11.stderr:+ grep 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:56:38.488 INFO:tasks.workunit.client.0.vm11.stderr:+ jq 'select(.name == "mon.a").fsid'
2026-03-09T21:56:38.488 INFO:tasks.workunit.client.0.vm11.stderr:+ jq '.[]'
2026-03-09T21:56:38.488 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls
2026-03-09T21:56:43.349 INFO:tasks.workunit.client.0.vm11.stdout:"00000000-0000-0000-0000-0000deadbeef"
2026-03-09T21:56:43.349 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls
2026-03-09T21:56:43.349 INFO:tasks.workunit.client.0.vm11.stderr:+ grep 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:56:43.350 INFO:tasks.workunit.client.0.vm11.stderr:+ jq '.[]'
2026-03-09T21:56:43.351 INFO:tasks.workunit.client.0.vm11.stderr:+ jq 'select(.name == "mgr.x").fsid'
2026-03-09T21:56:47.291 INFO:tasks.workunit.client.0.vm11.stdout:"00000000-0000-0000-0000-0000deadbeef"
2026-03-09T21:56:47.291 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls
2026-03-09T21:56:47.291 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q '\.'
2026-03-09T21:56:47.292 INFO:tasks.workunit.client.0.vm11.stderr:+ jq 'select(.name == "mon.a").version'
2026-03-09T21:56:47.299 INFO:tasks.workunit.client.0.vm11.stderr:+ jq '.[]'
2026-03-09T21:56:51.316 INFO:tasks.workunit.client.0.vm11.stderr:+ cp tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 tmp.test_cephadm.sh.Qa6KUG/tmp.wkX7WVXszC
2026-03-09T21:56:51.317 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'public addrv = [v2:127.0.0.1:3301,v1:127.0.0.1:6790]'
2026-03-09T21:56:51.318 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name mon.b --arg keyring /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/keyring --arg config tmp.test_cephadm.sh.Qa6KUG/tmp.wkX7WVXszC '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config}}'
2026-03-09T21:56:51.318 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy
2026-03-09T21:56:51.435 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mon-b
2026-03-09T21:56:51.435 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout
2026-03-09T21:56:51.435 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mon-b
2026-03-09T21:56:51.444 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mon.b
2026-03-09T21:56:51.444 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout
2026-03-09T21:56:51.444 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mon.b
2026-03-09T21:56:51.444 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon mon.b ...
2026-03-09T21:56:52.094 INFO:tasks.workunit.client.0.vm11.stderr:+ for u in ceph-$FSID@mon.b
2026-03-09T21:56:52.094 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mon.b
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stdout:enabled
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stdout:active
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mon.b
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ cond='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph mon stat | grep '\''2 mons'\'''
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ is_available mon.b 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph mon stat | grep '\''2 mons'\''' 30
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ local name=mon.b
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'condition=sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph mon stat | grep '\''2 mons'\'''
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ local tries=30
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ local num=0
2026-03-09T21:56:52.099 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph mon stat | grep '\''2 mons'\'''
2026-03-09T21:56:52.100 INFO:tasks.workunit.client.0.vm11.stderr:++ grep '2 mons'
2026-03-09T21:56:52.100 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph mon stat
2026-03-09T21:56:58.601 INFO:tasks.workunit.client.0.vm11.stdout:e2: 2 mons at {a=[v2:127.0.0.1:3300/0,v1:127.0.0.1:6789/0],b=[v2:127.0.0.1:3301/0,v1:127.0.0.1:6790/0]} removed_ranks: {} disallowed_leaders: {}, election epoch 10, leader 0 a, quorum 0,1 a,b
2026-03-09T21:56:58.601 INFO:tasks.workunit.client.0.vm11.stdout:mon.b is available
2026-03-09T21:56:58.601 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'mon.b is available'
2026-03-09T21:56:58.601 INFO:tasks.workunit.client.0.vm11.stderr:+ true
2026-03-09T21:56:58.601 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph auth get-or-create mgr.y mon 'allow profile mgr' osd 'allow *' mds 'allow *'
2026-03-09T21:56:59.066 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name mgr.y --arg keyring tmp.test_cephadm.sh.Qa6KUG/keyring.mgr.y --arg config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config}}'
2026-03-09T21:56:59.066 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy
2026-03-09T21:56:59.192 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mgr-y
2026-03-09T21:56:59.192 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout
2026-03-09T21:56:59.192 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mgr-y
2026-03-09T21:56:59.204 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mgr.y
2026-03-09T21:56:59.204 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout
2026-03-09T21:56:59.204 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mgr.y
2026-03-09T21:56:59.204 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon mgr.y ...
2026-03-09T21:56:59.750 INFO:tasks.workunit.client.0.vm11.stderr:+ for u in ceph-$FSID@mgr.y
2026-03-09T21:56:59.750 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mgr.y
2026-03-09T21:56:59.752 INFO:tasks.workunit.client.0.vm11.stdout:enabled
2026-03-09T21:56:59.752 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mgr.y
2026-03-09T21:56:59.753 INFO:tasks.workunit.client.0.vm11.stdout:active
2026-03-09T21:56:59.754 INFO:tasks.workunit.client.0.vm11.stderr:++ seq 1 30
2026-03-09T21:56:59.754 INFO:tasks.workunit.client.0.vm11.stderr:+ for f in `seq 1 30`
2026-03-09T21:56:59.755 INFO:tasks.workunit.client.0.vm11.stderr:+ jq .mgrmap.num_standbys
2026-03-09T21:56:59.755 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q 1
2026-03-09T21:56:59.755 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph -s -f json-pretty
2026-03-09T21:57:00.304 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 1
2026-03-09T21:57:01.308 INFO:tasks.workunit.client.0.vm11.stderr:+ for f in `seq 1 30`
2026-03-09T21:57:01.309 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q 1
2026-03-09T21:57:01.310 INFO:tasks.workunit.client.0.vm11.stderr:+ jq .mgrmap.num_standbys
2026-03-09T21:57:01.316 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph -s -f json-pretty
2026-03-09T21:57:01.864 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 1
2026-03-09T21:57:02.865 INFO:tasks.workunit.client.0.vm11.stderr:+ for f in `seq 1 30`
2026-03-09T21:57:02.865 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q 1
2026-03-09T21:57:02.866 INFO:tasks.workunit.client.0.vm11.stderr:+ jq .mgrmap.num_standbys
2026-03-09T21:57:02.867 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph -s -f json-pretty
2026-03-09T21:57:03.429 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 1
2026-03-09T21:57:04.431 INFO:tasks.workunit.client.0.vm11.stderr:+ for f in `seq 1 30`
2026-03-09T21:57:04.431 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph -s -f json-pretty
2026-03-09T21:57:04.431 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q 1
2026-03-09T21:57:04.431 INFO:tasks.workunit.client.0.vm11.stderr:+ jq .mgrmap.num_standbys
2026-03-09T21:57:04.880 INFO:tasks.workunit.client.0.vm11.stderr:+ break
2026-03-09T21:57:04.880 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph -s -f json-pretty
2026-03-09T21:57:04.880 INFO:tasks.workunit.client.0.vm11.stderr:+ grep -q 1
2026-03-09T21:57:04.881 INFO:tasks.workunit.client.0.vm11.stderr:+ jq .mgrmap.num_standbys
2026-03-09T21:57:05.331 INFO:tasks.workunit.client.0.vm11.stderr:+ dd if=/dev/zero of=tmp.test_cephadm.sh.Qa6KUG/test_cephadm_osd.img bs=1 count=0 seek=6G
2026-03-09T21:57:05.332 INFO:tasks.workunit.client.0.vm11.stderr:0+0 records in
2026-03-09T21:57:05.332 INFO:tasks.workunit.client.0.vm11.stderr:0+0 records out
2026-03-09T21:57:05.332 INFO:tasks.workunit.client.0.vm11.stderr:0 bytes copied, 7.5501e-05 s, 0.0 kB/s
2026-03-09T21:57:05.332 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo losetup -f
2026-03-09T21:57:05.339 INFO:tasks.workunit.client.0.vm11.stderr:+ loop_dev=/dev/loop3
2026-03-09T21:57:05.340 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo vgremove -f test_cephadm
2026-03-09T21:57:05.353 INFO:tasks.workunit.client.0.vm11.stderr: Volume group "test_cephadm" not found
2026-03-09T21:57:05.353 INFO:tasks.workunit.client.0.vm11.stderr: Cannot process volume group test_cephadm
2026-03-09T21:57:05.381 INFO:tasks.workunit.client.0.vm11.stderr:+ true
2026-03-09T21:57:05.381 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo losetup /dev/loop3 tmp.test_cephadm.sh.Qa6KUG/test_cephadm_osd.img
2026-03-09T21:57:05.400 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo pvcreate /dev/loop3
2026-03-09T21:57:05.427 INFO:tasks.workunit.client.0.vm11.stdout: Physical volume "/dev/loop3" successfully created.
2026-03-09T21:57:05.456 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo vgcreate test_cephadm /dev/loop3
2026-03-09T21:57:05.561 INFO:tasks.workunit.client.0.vm11.stdout: Volume group "test_cephadm" successfully created
2026-03-09T21:57:05.613 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph auth get client.bootstrap-osd
2026-03-09T21:57:06.081 INFO:tasks.workunit.client.0.vm11.stderr:++ seq 0 1
2026-03-09T21:57:06.082 INFO:tasks.workunit.client.0.vm11.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))`
2026-03-09T21:57:06.082 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo lvcreate -l 50%VG -n test_cephadm.0 test_cephadm
2026-03-09T21:57:06.160 INFO:tasks.workunit.client.0.vm11.stdout: Logical volume "test_cephadm.0" created.
2026-03-09T21:57:06.197 INFO:tasks.workunit.client.0.vm11.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))`
2026-03-09T21:57:06.197 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo lvcreate -l 50%VG -n test_cephadm.1 test_cephadm
2026-03-09T21:57:06.259 INFO:tasks.workunit.client.0.vm11.stdout: Logical volume "test_cephadm.1" created.
2026-03-09T21:57:06.293 INFO:tasks.workunit.client.0.vm11.stderr:++ seq 0 1 2026-03-09T21:57:06.294 INFO:tasks.workunit.client.0.vm11.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))` 2026-03-09T21:57:06.294 INFO:tasks.workunit.client.0.vm11.stderr:+ device_name=/dev/test_cephadm/test_cephadm.0 2026-03-09T21:57:06.294 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPH_VOLUME='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd --' 2026-03-09T21:57:06.294 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd -- lvm prepare --bluestore --data /dev/test_cephadm/test_cephadm.0 --no-systemd 2026-03-09T21:57:10.187 INFO:tasks.workunit.client.0.vm11.stdout: 2026-03-09T21:57:10.204 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd -- lvm batch --no-auto /dev/test_cephadm/test_cephadm.0 --yes --no-systemd 2026-03-09T21:57:10.775 INFO:tasks.workunit.client.0.vm11.stdout: 2026-03-09T21:57:10.792 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd -- lvm list --format json /dev/test_cephadm/test_cephadm.0 2026-03-09T21:57:11.353 INFO:tasks.workunit.client.0.vm11.stderr:++ jq -cr '.. | ."ceph.osd_id"? 
| select(.)' 2026-03-09T21:57:11.353 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo cat tmp.test_cephadm.sh.Qa6KUG/osd.map 2026-03-09T21:57:11.364 INFO:tasks.workunit.client.0.vm11.stderr:+ osd_id=0 2026-03-09T21:57:11.365 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo cat tmp.test_cephadm.sh.Qa6KUG/osd.map 2026-03-09T21:57:11.365 INFO:tasks.workunit.client.0.vm11.stderr:++ jq -cr '.. | ."ceph.osd_fsid"? | select(.)' 2026-03-09T21:57:11.376 INFO:tasks.workunit.client.0.vm11.stderr:+ osd_fsid=68e218b3-1ce5-4be1-969e-3c57242ad716 2026-03-09T21:57:11.376 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name osd.0 --arg keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd --arg config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --arg osd_fsid 68e218b3-1ce5-4be1-969e-3c57242ad716 '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config, "osd_fsid": $osd_fsid}}' 2026-03-09T21:57:11.376 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-09T21:57:11.492 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd-0 2026-03-09T21:57:11.492 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:11.492 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-osd-0 2026-03-09T21:57:11.503 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd.0 2026-03-09T21:57:11.503 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:11.503 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such 
container: ceph-00000000-0000-0000-0000-0000deadbeef-osd.0 2026-03-09T21:57:11.503 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon osd.0 ... 2026-03-09T21:57:12.682 INFO:tasks.workunit.client.0.vm11.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))` 2026-03-09T21:57:12.682 INFO:tasks.workunit.client.0.vm11.stderr:+ device_name=/dev/test_cephadm/test_cephadm.1 2026-03-09T21:57:12.682 INFO:tasks.workunit.client.0.vm11.stderr:+ CEPH_VOLUME='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd --' 2026-03-09T21:57:12.682 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd -- lvm prepare --bluestore --data /dev/test_cephadm/test_cephadm.1 --no-systemd 2026-03-09T21:57:16.898 INFO:tasks.workunit.client.0.vm11.stdout: 2026-03-09T21:57:16.913 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd -- lvm batch --no-auto /dev/test_cephadm/test_cephadm.1 --yes --no-systemd 2026-03-09T21:57:17.440 INFO:tasks.workunit.client.0.vm11.stdout: 2026-03-09T21:57:17.456 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd -- lvm list --format json /dev/test_cephadm/test_cephadm.1 2026-03-09T21:57:18.210 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo cat 
tmp.test_cephadm.sh.Qa6KUG/osd.map 2026-03-09T21:57:18.210 INFO:tasks.workunit.client.0.vm11.stderr:++ jq -cr '.. | ."ceph.osd_id"? | select(.)' 2026-03-09T21:57:18.222 INFO:tasks.workunit.client.0.vm11.stderr:+ osd_id=1 2026-03-09T21:57:18.222 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo cat tmp.test_cephadm.sh.Qa6KUG/osd.map 2026-03-09T21:57:18.222 INFO:tasks.workunit.client.0.vm11.stderr:++ jq -cr '.. | ."ceph.osd_fsid"? | select(.)' 2026-03-09T21:57:18.239 INFO:tasks.workunit.client.0.vm11.stderr:+ osd_fsid=bb06727c-bb8c-449a-94e9-fc38e8f46084 2026-03-09T21:57:18.239 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name osd.1 --arg keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd --arg config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --arg osd_fsid bb06727c-bb8c-449a-94e9-fc38e8f46084 '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config, "osd_fsid": $osd_fsid}}' 2026-03-09T21:57:18.239 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-09T21:57:18.367 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd-1 2026-03-09T21:57:18.367 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:18.367 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-osd-1 2026-03-09T21:57:18.378 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd.1 2026-03-09T21:57:18.378 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:18.378 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from 
daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-osd.1 2026-03-09T21:57:18.378 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon osd.1 ... 2026-03-09T21:57:19.705 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name node-exporter.a '{"fsid": $fsid, "name": $name}' 2026-03-09T21:57:19.708 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-09T21:57:19.896 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter-a 2026-03-09T21:57:19.896 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:19.896 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter-a 2026-03-09T21:57:19.913 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter.a 2026-03-09T21:57:19.913 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:19.913 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter.a 2026-03-09T21:57:19.913 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon node-exporter.a ... 
2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:+ cond='curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:+ is_available node-exporter 'curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 10 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:+ local name=node-exporter 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'condition=curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:+ local tries=10 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:+ local num=0 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:++ grep -q 'Node Exporter' 2026-03-09T21:57:20.496 INFO:tasks.workunit.client.0.vm11.stderr:++ curl http://localhost:9100 2026-03-09T21:57:20.516 INFO:tasks.workunit.client.0.vm11.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-09T21:57:20.522 INFO:tasks.workunit.client.0.vm11.stderr: Dload Upload Total Spent Left Speed 2026-03-09T21:57:20.522 INFO:tasks.workunit.client.0.vm11.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 2026-03-09T21:57:20.522 INFO:tasks.workunit.client.0.vm11.stderr:curl: (7) Failed to connect to localhost port 9100 after 2 ms: Connection refused 2026-03-09T21:57:20.523 INFO:tasks.workunit.client.0.vm11.stderr:+ num=1 2026-03-09T21:57:20.523 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' 1 -ge 10 ']' 2026-03-09T21:57:20.523 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 5 2026-03-09T21:57:25.525 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-09T21:57:25.525 
INFO:tasks.workunit.client.0.vm11.stderr:++ curl http://localhost:9100 2026-03-09T21:57:25.525 INFO:tasks.workunit.client.0.vm11.stderr:++ grep -q 'Node Exporter' 2026-03-09T21:57:25.529 INFO:tasks.workunit.client.0.vm11.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-09T21:57:25.529 INFO:tasks.workunit.client.0.vm11.stderr: Dload Upload Total Spent Left Speed 2026-03-09T21:57:25.529 INFO:tasks.workunit.client.0.vm11.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 150 100 150 0 0 235k 0 --:--:-- --:--:-- --:--:-- 146k 2026-03-09T21:57:25.530 INFO:tasks.workunit.client.0.vm11.stdout:node-exporter is available 2026-03-09T21:57:25.530 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'node-exporter is available' 2026-03-09T21:57:25.530 INFO:tasks.workunit.client.0.vm11.stderr:+ true 2026-03-09T21:57:25.530 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-09T21:57:25.530 INFO:tasks.workunit.client.0.vm11.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/prometheus.json 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name prometheus.a --argjson config_blobs '{ 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: "files": { 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: "prometheus.yml": [ 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: "global:", 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: " scrape_interval: 5s", 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: " evaluation_interval: 10s", 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: "", 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: "rule_files: ", 2026-03-09T21:57:25.531 INFO:tasks.workunit.client.0.vm11.stderr: " - '\''/etc/prometheus/alerting/*'\''", 2026-03-09T21:57:25.531 
INFO:tasks.workunit.client.0.vm11.stderr: "", 2026-03-09T21:57:25.532 INFO:tasks.workunit.client.0.vm11.stderr: "scrape_configs:", 2026-03-09T21:57:25.532 INFO:tasks.workunit.client.0.vm11.stderr: " - job_name: '\''prometheus'\''", 2026-03-09T21:57:25.532 INFO:tasks.workunit.client.0.vm11.stderr: " static_configs:", 2026-03-09T21:57:25.532 INFO:tasks.workunit.client.0.vm11.stderr: " - targets: ['\''localhost:9095'\'']" 2026-03-09T21:57:25.532 INFO:tasks.workunit.client.0.vm11.stderr: ] 2026-03-09T21:57:25.532 INFO:tasks.workunit.client.0.vm11.stderr: } 2026-03-09T21:57:25.532 INFO:tasks.workunit.client.0.vm11.stderr:}' '{"fsid": $fsid, "name": $name, "config_blobs": $config_blobs}' 2026-03-09T21:57:25.703 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-prometheus-a 2026-03-09T21:57:25.703 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:25.703 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-prometheus-a 2026-03-09T21:57:25.724 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-prometheus.a 2026-03-09T21:57:25.725 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:25.725 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-prometheus.a 2026-03-09T21:57:25.725 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon prometheus.a ... 
2026-03-09T21:57:31.506 INFO:tasks.workunit.client.0.vm11.stderr:+ cond='curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-09T21:57:31.507 INFO:tasks.workunit.client.0.vm11.stderr:+ is_available prometheus 'curl '\''localhost:9095/api/v1/query?query=up'\''' 10 2026-03-09T21:57:31.507 INFO:tasks.workunit.client.0.vm11.stderr:+ local name=prometheus 2026-03-09T21:57:31.507 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'condition=curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-09T21:57:31.507 INFO:tasks.workunit.client.0.vm11.stderr:+ local tries=10 2026-03-09T21:57:31.507 INFO:tasks.workunit.client.0.vm11.stderr:+ local num=0 2026-03-09T21:57:31.507 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-09T21:57:31.507 INFO:tasks.workunit.client.0.vm11.stderr:++ curl 'localhost:9095/api/v1/query?query=up' 2026-03-09T21:57:31.515 INFO:tasks.workunit.client.0.vm11.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-09T21:57:31.523 INFO:tasks.workunit.client.0.vm11.stderr: Dload Upload Total Spent Left Speed 2026-03-09T21:57:31.523 INFO:tasks.workunit.client.0.vm11.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 2026-03-09T21:57:31.523 INFO:tasks.workunit.client.0.vm11.stderr:curl: (7) Failed to connect to localhost port 9095 after 3 ms: Connection refused 2026-03-09T21:57:31.523 INFO:tasks.workunit.client.0.vm11.stderr:+ num=1 2026-03-09T21:57:31.523 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' 1 -ge 10 ']' 2026-03-09T21:57:31.523 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 5 2026-03-09T21:57:36.519 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-09T21:57:36.519 INFO:tasks.workunit.client.0.vm11.stderr:++ curl 'localhost:9095/api/v1/query?query=up' 2026-03-09T21:57:36.522 INFO:tasks.workunit.client.0.vm11.stderr: % Total % Received % Xferd Average 
Speed Time Time Time Current 2026-03-09T21:57:36.522 INFO:tasks.workunit.client.0.vm11.stderr: Dload Upload Total Spent Left Speed 2026-03-09T21:57:36.523 INFO:tasks.workunit.client.0.vm11.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 63 100 63 0 0 70155 0 --:--:-- --:--:-- --:--:-- 63000 2026-03-09T21:57:36.524 INFO:tasks.workunit.client.0.vm11.stdout:{"status":"success","data":{"resultType":"vector","result":[]}}prometheus is available 2026-03-09T21:57:36.524 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'prometheus is available' 2026-03-09T21:57:36.524 INFO:tasks.workunit.client.0.vm11.stderr:+ true 2026-03-09T21:57:36.524 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-09T21:57:36.524 INFO:tasks.workunit.client.0.vm11.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/grafana.json 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name grafana.a --argjson config_blobs '{ 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "files": { 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "grafana.ini": [ 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "[users]", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " default_theme = light", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "[auth.anonymous]", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " enabled = true", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " org_name = '\''Main Org.'\''", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " org_role = '\''Viewer'\''", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "[server]", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " domain = '\''bootstrap.storage.lab'\''", 2026-03-09T21:57:36.526 
INFO:tasks.workunit.client.0.vm11.stderr: " protocol = https", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " cert_file = /etc/grafana/certs/cert_file", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " cert_key = /etc/grafana/certs/cert_key", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " http_port = 3000", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " http_addr = localhost", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "[security]", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " admin_user = admin", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " admin_password = admin", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " allow_embedding = true" 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: ], 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "provisioning/datasources/ceph-dashboard.yml": [ 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: "deleteDatasources:", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " - name: '\''Dashboard'\''", 2026-03-09T21:57:36.526 INFO:tasks.workunit.client.0.vm11.stderr: " orgId: 1", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " ", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "datasources:", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " - name: '\''Dashboard'\''", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " type: '\''prometheus'\''", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " access: '\''proxy'\''", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " orgId: 1", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " url: '\''http://localhost:9095'\''", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " basicAuth: false", 2026-03-09T21:57:36.527 
INFO:tasks.workunit.client.0.vm11.stderr: " isDefault: true", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: " editable: false" 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: ], 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "certs/cert_file": [ 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "-----BEGIN CERTIFICATE-----", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "MIIDLTCCAhWgAwIBAgIUEH0mq6u93LKsWlNXst5pxWcuqkQwDQYJKoZIhvcNAQEL", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "BQAwJjELMAkGA1UECgwCSVQxFzAVBgNVBAMMDmNlcGgtZGFzaGJvYXJkMB4XDTIw", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "MDEwNTIyNDYyMFoXDTMwMDEwMjIyNDYyMFowJjELMAkGA1UECgwCSVQxFzAVBgNV", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "BAMMDmNlcGgtZGFzaGJvYXJkMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKC", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "AQEAqxh6eO0NTZJe+DoKZG/kozJCf+83eB3gWzwXoNinRmV/49f5WPR20DIxAe0R", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "saO6XynJXTrhvXT1bsARUq+LSmjWNFoYXopFuOJhGdWn4dmpuHwtpcFv2kjzNOKj", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "U2EG8j6bsRp1jFAzn7kdbSWT0UHySRXp9DPAjDiF3LjykMXiJMReccFXrB1pRi93", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "nJxED8d6oT5GazGB44svb+Zi6ABamZu5SDJC1Fr/O5rWFNQkH4hQEqDPj1817H9O", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "sm0mZiNy77ZQuAzOgZN153L3QOsyJismwNHfAMGMH9mzPKOjyhc13VlZyeEzml8p", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "ZpWQ2gi8P2r/FAr8bFL3MFnHKwIDAQABo1MwUTAdBgNVHQ4EFgQUZg3v7MX4J+hx", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "w3HENCrUkMK8tbwwHwYDVR0jBBgwFoAUZg3v7MX4J+hxw3HENCrUkMK8tbwwDwYD", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: 
"VR0TAQH/BAUwAwEB/zANBgkqhkiG9w0BAQsFAAOCAQEAaR/XPGKwUgVwH3KXAb6+", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "s9NTAt6lCmFdQz1ngoqFSizW7KGSXnOgd6xTiUCR0Tjjo2zKCwhIINaI6mwqMbrg", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "BOjb7diaqwFaitRs27AtdmaqMGndUqEBUn/k64Ld3VPGL4p0W2W+tXsyzZg1qQIn", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "JXb7c4+oWzXny7gHFheYQTwnHzDcNOf9vJiMGyYYvU1xTOGucu6dwtOVDDe1Z4Nq", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "AyIYWDScRr2FeAOXyx4aW2v5bjpTxvP+79/OOBbQ+p4y5F4PDrPeOSweGoo6huTR", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "+T+YI9Jfw2XCgV7NHWhfdt3fHHwUQzO6WszWU557pmCODLvXWsQ8P+GRiG7Nywm3", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "uA==", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "-----END CERTIFICATE-----" 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: ], 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "certs/cert_key": [ 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "-----BEGIN PRIVATE KEY-----", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "MIIEvAIBADANBgkqhkiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQCrGHp47Q1Nkl74", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "Ogpkb+SjMkJ/7zd4HeBbPBeg2KdGZX/j1/lY9HbQMjEB7RGxo7pfKcldOuG9dPVu", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "wBFSr4tKaNY0WhheikW44mEZ1afh2am4fC2lwW/aSPM04qNTYQbyPpuxGnWMUDOf", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "uR1tJZPRQfJJFen0M8CMOIXcuPKQxeIkxF5xwVesHWlGL3ecnEQPx3qhPkZrMYHj", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "iy9v5mLoAFqZm7lIMkLUWv87mtYU1CQfiFASoM+PXzXsf06ybSZmI3LvtlC4DM6B", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "k3XncvdA6zImKybA0d8AwYwf2bM8o6PKFzXdWVnJ4TOaXylmlZDaCLw/av8UCvxs", 2026-03-09T21:57:36.527 
INFO:tasks.workunit.client.0.vm11.stderr: "UvcwWccrAgMBAAECggEAeBv0BiYrm5QwdUORfhaKxAIJavRM1Vbr5EBYOgM90o54", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "bEN2ePsM2XUSsE5ziGfu8tVL1dX7GNwdW8UbpBc1ymO0VAYXa27YKUVKcy9o7oS1", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "v5v1E5Kq6esiSLL9gw/vJ2nKNFblxD2dL/hs7u1dSp5n7uSiW1tlRUp8toljRzts", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "1Cenp0J/a82HwWDE8j/H9NvitTOZ2cdwJ76V8GkBynlvr2ARjRfZGx0WXEJmoZYD", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "YUQVU303DB6Q2tkFco4LbPofkuhhMPhXsz3fZ/blHj/c78tqP9L5sQ29oqoPE1pS", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "DBOwKC/eoi5FY34RdLNL0dKq9MzbuYqEcCfZOJgxoQKBgQDf+5XF+aXQz2OmSaj6", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "1Yr+3KAKdfX/AYp22X1Wy4zWcZlgujgwQ1FG0zay8HVBM0/xn4UgOtcKCoXibePh", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "ag1t8aZINdRE1JcMzKmZoSvU9Xk30CNvygizuJVEKsJFPDbPzCpauDSplzcQb4pZ", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "wepucPuowkPMBx0iU3x0qSThWwKBgQDDjYs7d30xxSqWWXyCOZshy7UtHMNfqP15", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "kDfTXIZzuHvDf6ZNci10VY1eDZbpZfHgc6x1ElbKv2H4dYsgkENJZUi1YQDpVPKq", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "4N5teNykgAuagiR7dRFltSju3S7hIE6HInTv3hShaFPymlEE7zuBMuEUcuvYz5YN", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "RjxsvypKcQKBgCuuV+Y1KqZPW8K5SNAqRyIvCrMfkCr8NPG6tpvvtHa5zsyzZHPd", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "HQOv+1HoXSWrCSM5FfBUKU3XAYdIIRH76cSQRPp+LPiDcTXY0Baa/P5aJRrCZ7bM", 2026-03-09T21:57:36.527 INFO:tasks.workunit.client.0.vm11.stderr: "cugBznJt2FdCR/o8eeIZXIPabq2w4w1gKQUC2cFuqWQn2wGvwGzL89pTAoGAAfpx", 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: "mSVpT9KVzrWTC+I3To04BP/QfixAfDVYSzwZZBxOrDijXw8zpISlDHmIuE2+t62T", 
2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: "5g9Mb3qmLBRMVwT+mUR8CtGzZ6jjV5U0yti5KrTc6TA93D3f8i51/oygR8jC4p0X", 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: "n8GYZdWfW8nx3eHpsTHpkwJinmvjMbkvLU51yBECgYAnUAMyhNOWjbYS5QWd8i1W", 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: "SFQansVDeeT98RebrzmGwlgrCImHItJz0Tz8gkNB3+S2B2balqT0WHaDxQ8vCtwX", 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: "xB4wd+gMomgdYtHGRnRwj1UyRXDk0c1TgGdRjOn3URaezBMibHTQSbFgPciJgAuU", 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: "mEl75h1ToBX9yvnH39o50g==", 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: "-----END PRIVATE KEY-----" 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: ] 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr: } 2026-03-09T21:57:36.528 INFO:tasks.workunit.client.0.vm11.stderr:}' '{"fsid": $fsid, "name": $name, "config_blobs": $config_blobs}' 2026-03-09T21:57:36.647 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-grafana-a 2026-03-09T21:57:36.647 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:36.648 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-grafana-a 2026-03-09T21:57:36.657 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-grafana.a 2026-03-09T21:57:36.657 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:36.657 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-grafana.a 2026-03-09T21:57:36.657 
INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon grafana.a ... 2026-03-09T21:57:45.858 INFO:tasks.workunit.client.0.vm11.stderr:+ cond='curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-09T21:57:45.858 INFO:tasks.workunit.client.0.vm11.stderr:+ is_available grafana 'curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 50 2026-03-09T21:57:45.858 INFO:tasks.workunit.client.0.vm11.stderr:+ local name=grafana 2026-03-09T21:57:45.859 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'condition=curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-09T21:57:45.859 INFO:tasks.workunit.client.0.vm11.stderr:+ local tries=50 2026-03-09T21:57:45.859 INFO:tasks.workunit.client.0.vm11.stderr:+ local num=0 2026-03-09T21:57:45.859 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-09T21:57:45.859 INFO:tasks.workunit.client.0.vm11.stderr:++ curl --insecure https://localhost:3000 2026-03-09T21:57:45.859 INFO:tasks.workunit.client.0.vm11.stderr:++ grep -q grafana 2026-03-09T21:57:45.863 INFO:tasks.workunit.client.0.vm11.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-09T21:57:45.863 INFO:tasks.workunit.client.0.vm11.stderr: Dload Upload Total Spent Left Speed 2026-03-09T21:57:45.863 INFO:tasks.workunit.client.0.vm11.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 2026-03-09T21:57:45.864 INFO:tasks.workunit.client.0.vm11.stderr:curl: (7) Failed to connect to localhost port 3000 after 0 ms: Connection refused 2026-03-09T21:57:45.864 INFO:tasks.workunit.client.0.vm11.stderr:+ num=1 2026-03-09T21:57:45.864 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' 1 -ge 50 ']' 2026-03-09T21:57:45.864 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 5 2026-03-09T21:57:50.865 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'curl --insecure 
'\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-09T21:57:50.865 INFO:tasks.workunit.client.0.vm11.stderr:++ curl --insecure https://localhost:3000 2026-03-09T21:57:50.870 INFO:tasks.workunit.client.0.vm11.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-09T21:57:50.870 INFO:tasks.workunit.client.0.vm11.stderr: Dload Upload Total Spent Left Speed 2026-03-09T21:57:50.870 INFO:tasks.workunit.client.0.vm11.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0++ grep -q grafana 2026-03-09T21:57:50.882 INFO:tasks.workunit.client.0.vm11.stderr: 100 43313 0 43313 0 0 3232k 0 --:--:-- --:--:-- --:--:-- 3524k 2026-03-09T21:57:50.884 INFO:tasks.workunit.client.0.vm11.stdout:grafana is available 2026-03-09T21:57:50.884 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'grafana is available' 2026-03-09T21:57:50.884 INFO:tasks.workunit.client.0.vm11.stderr:+ true 2026-03-09T21:57:50.884 INFO:tasks.workunit.client.0.vm11.stderr:+ nfs_stop 2026-03-09T21:57:50.884 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'units=nfs-server nfs-kernel-server' 2026-03-09T21:57:50.884 INFO:tasks.workunit.client.0.vm11.stderr:+ for unit in $units 2026-03-09T21:57:50.884 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl --no-pager status nfs-server 2026-03-09T21:57:50.889 INFO:tasks.workunit.client.0.vm11.stderr:+ for unit in $units 2026-03-09T21:57:50.889 INFO:tasks.workunit.client.0.vm11.stderr:+ systemctl --no-pager status nfs-kernel-server 2026-03-09T21:57:50.892 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep LISTEN' 2026-03-09T21:57:50.892 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x 2026-03-09T21:57:50.892 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep LISTEN' 2026-03-09T21:57:50.892 INFO:tasks.workunit.client.0.vm11.stderr:++ grep LISTEN 2026-03-09T21:57:50.893 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo ss -tlnp '( sport = :nfs 
)' 2026-03-09T21:57:50.910 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0 2026-03-09T21:57:50.910 INFO:tasks.workunit.client.0.vm11.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/nfs.json 2026-03-09T21:57:50.911 INFO:tasks.workunit.client.0.vm11.stderr:++ jq -r '.["pool"]' 2026-03-09T21:57:50.923 INFO:tasks.workunit.client.0.vm11.stderr:+ nfs_rados_pool=nfs-ganesha 2026-03-09T21:57:50.923 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph osd pool create nfs-ganesha 64 2026-03-09T21:57:52.195 INFO:tasks.workunit.client.0.vm11.stderr:pool 'nfs-ganesha' created 2026-03-09T21:57:52.280 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- rados --pool nfs-ganesha --namespace nfs-ns create conf-nfs.a 2026-03-09T21:57:53.492 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph orch pause 2026-03-09T21:57:54.089 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-09T21:57:54.089 INFO:tasks.workunit.client.0.vm11.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/nfs.json 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name nfs.a --arg keyring 
tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj --arg config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --argjson config_blobs '{ 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "pool" : "nfs-ganesha", 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "namespace" : "nfs-ns", 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "files": { 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "ganesha.conf": [ 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "RADOS_URLS {", 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: " userid = admin;", 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "}", 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "", 2026-03-09T21:57:54.091 INFO:tasks.workunit.client.0.vm11.stderr: "%url rados://nfs-ganesha/nfs-ns/conf-nfs.a", 2026-03-09T21:57:54.092 INFO:tasks.workunit.client.0.vm11.stderr: "" 2026-03-09T21:57:54.092 INFO:tasks.workunit.client.0.vm11.stderr: ], 2026-03-09T21:57:54.092 INFO:tasks.workunit.client.0.vm11.stderr: "idmap.conf": "" 2026-03-09T21:57:54.092 INFO:tasks.workunit.client.0.vm11.stderr: } 2026-03-09T21:57:54.092 INFO:tasks.workunit.client.0.vm11.stderr:}' '{"fsid": $fsid, "name": $name, "params": {"keyring": $keyring, "config": $config}, "config_blobs": $config_blobs}' 2026-03-09T21:57:54.203 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-nfs-a 2026-03-09T21:57:54.203 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:54.203 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-nfs-a 2026-03-09T21:57:54.214 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} 
ceph-00000000-0000-0000-0000-0000deadbeef-nfs.a 2026-03-09T21:57:54.214 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:57:54.214 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-nfs.a 2026-03-09T21:57:54.214 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon nfs.a ... 2026-03-09T21:57:54.322 INFO:tasks.workunit.client.0.vm11.stderr:Verifying port 0.0.0.0:2049 ... 2026-03-09T21:57:54.323 INFO:tasks.workunit.client.0.vm11.stderr:Creating ganesha config... 2026-03-09T21:57:54.323 INFO:tasks.workunit.client.0.vm11.stderr:Write file: /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/nfs.a/etc/ganesha/ganesha.conf 2026-03-09T21:57:54.325 INFO:tasks.workunit.client.0.vm11.stderr:Write file: /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/nfs.a/etc/ganesha/idmap.conf 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:+ cond='sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:+ is_available nfs 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 10 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:+ local name=nfs 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'condition=sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:+ local tries=10 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:+ local num=0 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:++ grep ganesha.nfsd 2026-03-09T21:57:54.778 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo ss -tlnp '( sport = :nfs )' 2026-03-09T21:57:54.798 
INFO:tasks.workunit.client.0.vm11.stderr:+ num=1 2026-03-09T21:57:54.798 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' 1 -ge 10 ']' 2026-03-09T21:57:54.798 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 5 2026-03-09T21:57:59.801 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-09T21:57:59.801 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo ss -tlnp '( sport = :nfs )' 2026-03-09T21:57:59.801 INFO:tasks.workunit.client.0.vm11.stderr:++ grep ganesha.nfsd 2026-03-09T21:57:59.816 INFO:tasks.workunit.client.0.vm11.stdout:LISTEN 0 4096 *:2049 *:* users:(("ganesha.nfsd",pid=37363,fd=24)) 2026-03-09T21:57:59.823 INFO:tasks.workunit.client.0.vm11.stdout:nfs is available 2026-03-09T21:57:59.823 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'nfs is available' 2026-03-09T21:57:59.823 INFO:tasks.workunit.client.0.vm11.stderr:+ true 2026-03-09T21:57:59.823 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --keyring tmp.test_cephadm.sh.Qa6KUG/tmp.4oVbVxOdmj -- ceph orch resume 2026-03-09T21:58:00.340 INFO:tasks.workunit.client.0.vm11.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/custom_container.json 2026-03-09T21:58:00.340 INFO:tasks.workunit.client.0.vm11.stderr:++ jq -r .image 2026-03-09T21:58:00.350 INFO:tasks.workunit.client.0.vm11.stderr:+ alertmanager_image=quay.io/prometheus/alertmanager:v0.20.0 2026-03-09T21:58:00.350 INFO:tasks.workunit.client.0.vm11.stderr:++ jq .ports /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/custom_container.json 2026-03-09T21:58:00.360 INFO:tasks.workunit.client.0.vm11.stderr:+ tcp_ports='[ 2026-03-09T21:58:00.360 INFO:tasks.workunit.client.0.vm11.stderr: 9093, 2026-03-09T21:58:00.360 
INFO:tasks.workunit.client.0.vm11.stderr: 9094 2026-03-09T21:58:00.360 INFO:tasks.workunit.client.0.vm11.stderr:]' 2026-03-09T21:58:00.360 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-09T21:58:00.360 INFO:tasks.workunit.client.0.vm11.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/custom_container.json 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name container.alertmanager.a --arg keyring tmp.test_cephadm.sh.Qa6KUG/keyring.bootstrap.osd --arg config tmp.test_cephadm.sh.Qa6KUG/tmp.gncoCPU6Q5 --arg image quay.io/prometheus/alertmanager:v0.20.0 --argjson tcp_ports '[ 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: 9093, 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: 9094 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr:]' --argjson config_blobs '{ 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "image": "quay.io/prometheus/alertmanager:v0.20.0", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "ports": [9093, 9094], 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "args": [ 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "-p", "9093:9093", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "-p", "9094:9094" 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: ], 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "dirs": ["etc/alertmanager"], 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "files": { 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "etc/alertmanager/alertmanager.yml": [ 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "global:", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " resolve_timeout: 5m", 2026-03-09T21:58:00.361 
INFO:tasks.workunit.client.0.vm11.stderr: "", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "route:", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " group_by: ['\''alertname'\'']", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " group_wait: 10s", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " group_interval: 10s", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " repeat_interval: 1h", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " receiver: '\''web.hook'\''", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "receivers:", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "- name: '\''web.hook'\''", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " webhook_configs:", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " - url: '\''http://127.0.0.1:5001/'\''", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "inhibit_rules:", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " - source_match:", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " severity: '\''critical'\''", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " target_match:", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " severity: '\''warning'\''", 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: " equal: ['\''alertname'\'', '\''dev'\'', '\''instance'\'']" 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: ] 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: }, 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "volume_mounts": { 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: "etc/alertmanager": "/etc/alertmanager" 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr: } 2026-03-09T21:58:00.361 INFO:tasks.workunit.client.0.vm11.stderr:}' 
'{"fsid": $fsid, "name": $name, "image": $image, "params": {"keyring": $keyring, "config": $config, "tcp_ports": $tcp_ports}, "config_blobs": $config_blobs}' 2026-03-09T21:58:00.473 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-09T21:58:00.473 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:58:00.473 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-09T21:58:00.484 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-09T21:58:00.484 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:58:00.484 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-09T21:58:00.484 INFO:tasks.workunit.client.0.vm11.stderr:Deploy daemon container.alertmanager.a ... 2026-03-09T21:58:00.485 INFO:tasks.workunit.client.0.vm11.stderr:Verifying port 0.0.0.0:9093 ... 2026-03-09T21:58:00.485 INFO:tasks.workunit.client.0.vm11.stderr:Verifying port 0.0.0.0:9094 ... 2026-03-09T21:58:00.485 INFO:tasks.workunit.client.0.vm11.stderr:Verifying port 0.0.0.0:9093 ... 2026-03-09T21:58:00.485 INFO:tasks.workunit.client.0.vm11.stderr:Verifying port 0.0.0.0:9094 ... 2026-03-09T21:58:00.485 INFO:tasks.workunit.client.0.vm11.stderr:Creating custom container configuration dirs/files in /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/container.alertmanager.a ... 
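[editor's note] The deploy steps traced above feed `cephadm _orch deploy` a JSON payload assembled with `jq --null-input`, passing strings via `--arg` and raw JSON via `--argjson`. A minimal sketch of that assembly (the fsid matches the trace; the name, file paths, and `config_blobs` content are simplified placeholders, not the run's real temp files):

```shell
#!/usr/bin/env bash
# Sketch: build a `cephadm _orch deploy`-style payload the way the trace does.
# --arg injects shell strings as JSON strings; --argjson injects raw JSON.
fsid='00000000-0000-0000-0000-0000deadbeef'
config_blobs='{"pool": "nfs-ganesha", "namespace": "nfs-ns"}'

jq --null-input \
   --arg fsid "$fsid" \
   --arg name 'nfs.a' \
   --arg keyring '/tmp/keyring' \
   --arg config '/tmp/config' \
   --argjson config_blobs "$config_blobs" \
   '{"fsid": $fsid, "name": $name,
     "params": {"keyring": $keyring, "config": $config},
     "config_blobs": $config_blobs}'
```

In the real run this JSON is piped to `sudo cephadm _orch deploy` on stdin; the sketch only shows the payload construction.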
2026-03-09T21:58:00.485 INFO:tasks.workunit.client.0.vm11.stderr:Creating directory: etc/alertmanager 2026-03-09T21:58:00.485 INFO:tasks.workunit.client.0.vm11.stderr:Creating file: etc/alertmanager/alertmanager.yml 2026-03-09T21:58:01.004 INFO:tasks.workunit.client.0.vm11.stderr:+ cond='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-09T21:58:01.005 INFO:tasks.workunit.client.0.vm11.stderr:+ is_available alertmanager.yml 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 10 2026-03-09T21:58:01.005 INFO:tasks.workunit.client.0.vm11.stderr:+ local name=alertmanager.yml 2026-03-09T21:58:01.005 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'condition=sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-09T21:58:01.005 INFO:tasks.workunit.client.0.vm11.stderr:+ local tries=10 2026-03-09T21:58:01.005 INFO:tasks.workunit.client.0.vm11.stderr:+ local num=0 2026-03-09T21:58:01.005 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-09T21:58:01.005 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml 2026-03-09T21:58:01.102 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/container.alertmanager.a/config 
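[editor's note] The `+ local name=…`, `+ local tries=10`, `+ local num=0`, `eval`, `sleep 5` lines above trace a small polling helper. A reconstruction from the trace — the real source lives in `qa/workunits/cephadm/test_cephadm.sh`, so treat this as a sketch of the traced behavior, not the verbatim function:

```shell
# Sketch of the is_available retry helper, reconstructed from the xtrace:
# evaluate a condition up to $tries times, sleeping 5s between attempts.
is_available() {
    local name="$1"
    local condition="$2"
    local tries="$3"
    local num=0
    while ! eval "$condition"; do
        num=$((num + 1))
        if [ "$num" -ge "$tries" ]; then
            echo "$name is not available"
            return 1
        fi
        sleep 5
    done
    echo "$name is available"
}
```

A call like `is_available nfs "sudo ss -tlnp '( sport = :nfs )' | grep 'ganesha.nfsd'" 10` mirrors the invocation seen in the trace: the first attempt fails before ganesha binds port 2049, the helper sleeps 5 seconds, and the second attempt succeeds.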
2026-03-09T21:58:01.110 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-09T21:58:01.110 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:58:01.110 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-09T21:58:01.119 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-09T21:58:01.119 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stdout 2026-03-09T21:58:01.119 INFO:tasks.workunit.client.0.vm11.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-09T21:58:01.119 INFO:tasks.workunit.client.0.vm11.stderr:ERROR: unable to find container "ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a" 2026-03-09T21:58:01.130 INFO:tasks.workunit.client.0.vm11.stderr:+ num=1 2026-03-09T21:58:01.130 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' 1 -ge 10 ']' 2026-03-09T21:58:01.130 INFO:tasks.workunit.client.0.vm11.stderr:+ sleep 5 2026-03-09T21:58:06.132 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-09T21:58:06.132 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml 2026-03-09T21:58:06.225 
INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/container.alertmanager.a/config 2026-03-09T21:58:06.291 INFO:tasks.workunit.client.0.vm11.stdout:alertmanager.yml is available 2026-03-09T21:58:06.291 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'alertmanager.yml is available' 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ true 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ cond='curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ is_available alertmanager 'curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 10 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ local name=alertmanager 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'condition=curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ local tries=10 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ local num=0 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:++ curl http://localhost:9093 2026-03-09T21:58:06.292 INFO:tasks.workunit.client.0.vm11.stderr:++ grep -q Alertmanager 2026-03-09T21:58:06.297 INFO:tasks.workunit.client.0.vm11.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-09T21:58:06.302 INFO:tasks.workunit.client.0.vm11.stderr: Dload Upload Total Spent Left Speed 2026-03-09T21:58:06.302 INFO:tasks.workunit.client.0.vm11.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 1314 100 1314 0 0 409k 0 --:--:-- --:--:-- --:--:-- 427k 2026-03-09T21:58:06.302 INFO:tasks.workunit.client.0.vm11.stdout:alertmanager is available 2026-03-09T21:58:06.302 
INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'alertmanager is available' 2026-03-09T21:58:06.302 INFO:tasks.workunit.client.0.vm11.stderr:+ true 2026-03-09T21:58:06.302 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-09T21:58:06.389 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:10.948 INFO:tasks.workunit.client.0.vm11.stderr:stdout enabled 2026-03-09T21:58:10.961 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-active 2026-03-09T21:58:11.058 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:14.989 INFO:tasks.workunit.client.0.vm11.stderr:stdout active 2026-03-09T21:58:15.012 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.xyz -- is-active 2026-03-09T21:58:15.012 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x 2026-03-09T21:58:15.012 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.xyz -- is-active 2026-03-09T21:58:15.012 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.xyz -- is-active 2026-03-09T21:58:15.112 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.xyz/config 2026-03-09T21:58:19.028 INFO:tasks.workunit.client.0.vm11.stderr:ERROR: Daemon not found: mon.xyz. 
See `cephadm ls` 2026-03-09T21:58:19.043 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0 2026-03-09T21:58:19.043 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- disable 2026-03-09T21:58:19.138 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:23.307 INFO:tasks.workunit.client.0.vm11.stderr:stderr Removed /etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef.target.wants/ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service. 2026-03-09T21:58:23.320 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-09T21:58:23.320 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x 2026-03-09T21:58:23.320 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-09T21:58:23.320 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-09T21:58:23.417 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:27.112 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 1 from systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mon.a 2026-03-09T21:58:27.112 INFO:tasks.workunit.client.0.vm11.stderr:stdout disabled 2026-03-09T21:58:27.128 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0 2026-03-09T21:58:27.128 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 
00000000-0000-0000-0000-0000deadbeef --name mon.a -- enable 2026-03-09T21:58:27.217 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:31.379 INFO:tasks.workunit.client.0.vm11.stderr:stderr Created symlink /etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef.target.wants/ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service → /etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef@.service. 2026-03-09T21:58:31.395 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-09T21:58:31.492 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:35.184 INFO:tasks.workunit.client.0.vm11.stderr:stdout enabled 2026-03-09T21:58:35.195 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status 2026-03-09T21:58:35.289 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout ● ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service - Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Loaded: loaded (/etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef@.service; enabled; vendor preset: enabled) 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Active: active (running) since Mon 2026-03-09 21:56:12 UTC; 2min 26s ago 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Main PID: 21181 (bash) 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Tasks: 9 (limit: 9553) 
2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Memory: 8.8M 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout CPU: 66ms 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout CGroup: /system.slice/system-ceph\x2d00000000\x2d0000\x2d0000\x2d0000\x2d0000deadbeef.slice/ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout ├─21181 /bin/bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.run 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout └─21198 /usr/bin/docker run --rm --ipc=host --stop-signal=SIGTERM --ulimit nofile=1048576 --net=host --entrypoint /usr/bin/ceph-mon --privileged --group-add=disk --init --name ceph-00000000-0000-0000-0000-0000deadbeef-mon-a --pids-limit=0 -e CONTAINER_IMAGE=quay.ceph.io/ceph-ci/ceph:squid -e NODE_NAME=vm11 -e TCMALLOC_MAX_TOTAL_THREAD_CACHE_BYTES=134217728 -v /var/run/ceph/00000000-0000-0000-0000-0000deadbeef:/var/run/ceph:z -v /var/log/ceph/00000000-0000-0000-0000-0000deadbeef:/var/log/ceph:z -v /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/crash:/var/lib/ceph/crash:z -v /dev:/dev -v /run/udev:/run/udev -v /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a:/var/lib/ceph/mon/ceph-a:z -v /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config:/etc/ceph/ceph.conf:z quay.ceph.io/ceph-ci/ceph:squid -n mon.a -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-stderr=true "--default-log-stderr-prefix=debug " --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-stderr=true 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:31 vm11 bash[21198]: cluster 2026-03-09T21:58:30.280306+0000 mgr.x (mgr.14150) 80 : cluster [DBG] pgmap v69: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 
MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:31 vm11 bash[21198]: cluster 2026-03-09T21:58:30.280306+0000 mgr.x (mgr.14150) 80 : cluster [DBG] pgmap v69: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:32 vm11 bash[21198]: debug 2026-03-09T21:58:32.882+0000 7fd89faf3640 1 mon.a@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 2026-03-09T21:58:39.215 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:33 vm11 bash[21198]: cluster 2026-03-09T21:58:32.280656+0000 mgr.x (mgr.14150) 81 : cluster [DBG] pgmap v70: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-09T21:58:39.216 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:33 vm11 bash[21198]: cluster 2026-03-09T21:58:32.280656+0000 mgr.x (mgr.14150) 81 : cluster [DBG] pgmap v70: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-09T21:58:39.216 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:35 vm11 bash[21198]: cluster 2026-03-09T21:58:34.281004+0000 mgr.x (mgr.14150) 82 : cluster [DBG] pgmap v71: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-09T21:58:39.216 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:35 vm11 bash[21198]: cluster 2026-03-09T21:58:34.281004+0000 mgr.x (mgr.14150) 82 : cluster [DBG] pgmap v71: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects 
degraded (33.333%) 2026-03-09T21:58:39.216 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:37 vm11 bash[21198]: cluster 2026-03-09T21:58:36.281319+0000 mgr.x (mgr.14150) 83 : cluster [DBG] pgmap v72: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-09T21:58:39.216 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:37 vm11 bash[21198]: cluster 2026-03-09T21:58:36.281319+0000 mgr.x (mgr.14150) 83 : cluster [DBG] pgmap v72: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-09T21:58:39.216 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:37 vm11 bash[21198]: debug 2026-03-09T21:58:37.882+0000 7fd89faf3640 1 mon.a@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 2026-03-09T21:58:39.228 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- stop 2026-03-09T21:58:39.320 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_return_code 3 sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status 2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x 2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:+ local expected_code=3 2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:+ shift 2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:+ local 'command=sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status' 
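[editor's note] Each `cephadm unit --fsid … --name … -- <op>` call above resolves to a plain systemctl operation on a per-cluster templated unit, as the earlier `systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mon.a` line shows. A sketch of the name mapping (the helper name is made up for illustration; the unit naming scheme itself is confirmed by the trace):

```shell
# Hypothetical helper illustrating how the systemd unit name is derived
# from the cluster fsid and daemon name: ceph-<fsid>@<daemon>.
ceph_unit_name() {
    local fsid="$1" daemon="$2"
    echo "ceph-${fsid}@${daemon}"
}

fsid='00000000-0000-0000-0000-0000deadbeef'
ceph_unit_name "$fsid" mon.a
# -> ceph-00000000-0000-0000-0000-0000deadbeef@mon.a
# `cephadm unit ... -- is-enabled` then runs:
#   systemctl is-enabled "$(ceph_unit_name "$fsid" mon.a)"
```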
2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:+ set +e
2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status'
2026-03-09T21:58:43.656 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status
2026-03-09T21:58:43.745 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:Non-zero exit code 3 from systemctl status ceph-00000000-0000-0000-0000-0000deadbeef@mon.a
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout ○ ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service - Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Loaded: loaded (/etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef@.service; enabled; vendor preset: enabled)
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Active: inactive (dead) since Mon 2026-03-09 21:58:43 UTC; 4s ago
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Process: 21181 ExecStart=/bin/bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.run (code=exited, status=0/SUCCESS)
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Process: 38063 ExecStop=/bin/bash -c bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.stop (code=exited, status=0/SUCCESS)
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Process: 38119 ExecStopPost=/bin/bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.poststop (code=exited, status=0/SUCCESS)
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Main PID: 21181 (code=exited, status=0/SUCCESS)
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout CPU: 126ms
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:42 vm11 bash[21198]: debug 2026-03-09T21:58:42.882+0000 7fd89faf3640 1 mon.a@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 systemd[1]: Stopping Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef...
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 bash[21198]: debug 2026-03-09T21:58:43.286+0000 7fd8a2af9640 -1 received signal: Terminated from /sbin/docker-init -- /usr/bin/ceph-mon -n mon.a -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-stderr=true --default-log-stderr-prefix=debug --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-stderr=true (PID: 1) UID: 0
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 bash[21198]: debug 2026-03-09T21:58:43.286+0000 7fd8a2af9640 -1 mon.a@0(leader) e2 *** Got Signal Terminated ***
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 bash[21198]: debug 2026-03-09T21:58:43.286+0000 7fd8a2af9640 1 mon.a@0(leader) e2 shutdown
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 bash[21198]: debug 2026-03-09T21:58:43.294+0000 7fd8a451ed80 4 rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 bash[21198]: debug 2026-03-09T21:58:43.294+0000 7fd8a451ed80 4 rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 bash[38086]: ceph-00000000-0000-0000-0000-0000deadbeef-mon-a
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 systemd[1]: ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service: Deactivated successfully.
2026-03-09T21:58:48.300 INFO:tasks.workunit.client.0.vm11.stderr:stdout Mar 09 21:58:43 vm11 systemd[1]: Stopped Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef.
2026-03-09T21:58:48.313 INFO:tasks.workunit.client.0.vm11.stderr:+ local return_code=3
2026-03-09T21:58:48.313 INFO:tasks.workunit.client.0.vm11.stderr:+ set -e
2026-03-09T21:58:48.313 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' '!' 3 -eq 3 ']'
2026-03-09T21:58:48.313 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0
2026-03-09T21:58:48.313 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- start
2026-03-09T21:58:48.411 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config
2026-03-09T21:58:52.353 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef -- true
2026-03-09T21:58:56.364 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config
2026-03-09T21:58:56.527 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef -- test -d /var/log/ceph
2026-03-09T21:59:00.401 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config
2026-03-09T21:59:00.567 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 10 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 60
2026-03-09T21:59:00.567 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x
2026-03-09T21:59:00.567 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 10 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 60
2026-03-09T21:59:00.567 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 10 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 60
2026-03-09T21:59:04.439 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config
2026-03-09T21:59:14.440 INFO:tasks.workunit.client.0.vm11.stderr:ERROR: Command `['/usr/bin/docker', 'run', '--rm', '--ipc=host', '--net=host', '--privileged', '--group-add=disk', '--init', '-i', '-e', 'CONTAINER_IMAGE=quay.ceph.io/ceph-ci/ceph:squid', '-e', 'NODE_NAME=vm11', '-v', '/var/run/ceph/00000000-0000-0000-0000-0000deadbeef:/var/run/ceph:z', '-v', '/var/log/ceph/00000000-0000-0000-0000-0000deadbeef:/var/log/ceph:z', '-v', '/var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/crash:/var/lib/ceph/crash:z', '-v', '/var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config:/etc/ceph/ceph.conf:z', '-v', '/var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/config/ceph.client.admin.keyring:/etc/ceph/ceph.keyring:z', '--entrypoint', 'sleep', 'quay.ceph.io/ceph-ci/ceph:squid', '60']` timed out after 10 seconds
2026-03-09T21:59:14.458 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0
2026-03-09T21:59:14.458 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 60 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 10
2026-03-09T21:59:19.101 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config
2026-03-09T21:59:29.409 INFO:tasks.workunit.client.0.vm11.stderr:++ basename tmp.test_cephadm.sh.Qa6KUG
2026-03-09T21:59:29.411 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --mount tmp.test_cephadm.sh.Qa6KUG tmp.test_cephadm.sh.Lgi6Ng -- stat /mnt/tmp.test_cephadm.sh.Qa6KUG
2026-03-09T21:59:34.065 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config
2026-03-09T21:59:34.160 INFO:tasks.workunit.client.0.vm11.stdout: File: /mnt/tmp.test_cephadm.sh.Qa6KUG
2026-03-09T21:59:34.160 INFO:tasks.workunit.client.0.vm11.stdout: Size: 4096 Blocks: 8 IO Block: 4096 directory
2026-03-09T21:59:34.160 INFO:tasks.workunit.client.0.vm11.stdout:Device: fe01h/65025d Inode: 1046139 Links: 2
2026-03-09T21:59:34.160 INFO:tasks.workunit.client.0.vm11.stdout:Access: (0700/drwx------) Uid: ( 1000/ UNKNOWN) Gid: ( 1000/ UNKNOWN)
2026-03-09T21:59:34.161 INFO:tasks.workunit.client.0.vm11.stdout:Access: 2026-03-09 21:55:24.974846593 +0000
2026-03-09T21:59:34.161 INFO:tasks.workunit.client.0.vm11.stdout:Modify: 2026-03-09 21:57:10.790846593 +0000
2026-03-09T21:59:34.161 INFO:tasks.workunit.client.0.vm11.stdout:Change: 2026-03-09 21:57:10.790846593 +0000
2026-03-09T21:59:34.161 INFO:tasks.workunit.client.0.vm11.stdout: Birth: 2026-03-09 21:55:24.974846593 +0000
2026-03-09T21:59:34.210 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter
2026-03-09T21:59:34.210 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x
2026-03-09T21:59:34.210 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter
2026-03-09T21:59:34.210 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter
2026-03-09T21:59:34.304 INFO:tasks.workunit.client.0.vm11.stderr:usage: cephadm enter [-h] [--fsid FSID] --name NAME ...
2026-03-09T21:59:34.304 INFO:tasks.workunit.client.0.vm11.stderr:cephadm enter: error: the following arguments are required: --name/-n
2026-03-09T21:59:34.320 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0
2026-03-09T21:59:34.320 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- test -d /var/lib/ceph/mon/ceph-a
2026-03-09T21:59:34.412 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config
2026-03-09T21:59:34.464 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- test -d /var/lib/ceph/mgr/ceph-x
2026-03-09T21:59:34.557 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mgr.x/config
2026-03-09T21:59:34.628 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- pidof ceph-mon
2026-03-09T21:59:34.719 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config
2026-03-09T21:59:34.750 INFO:tasks.workunit.client.0.vm11.stdout:7
2026-03-09T21:59:34.773 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- pidof ceph-mon
2026-03-09T21:59:34.774 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x
2026-03-09T21:59:34.774 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- pidof ceph-mon
2026-03-09T21:59:34.774 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- pidof ceph-mon
2026-03-09T21:59:34.866 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mgr.x/config
2026-03-09T21:59:34.930 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0
2026-03-09T21:59:34.930 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- pidof ceph-mgr
2026-03-09T21:59:35.018 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mgr.x/config
2026-03-09T21:59:35.060 INFO:tasks.workunit.client.0.vm11.stdout:8
2026-03-09T21:59:35.082 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 60 enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- sleep 10
2026-03-09T21:59:35.172 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config
2026-03-09T21:59:45.234 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef -- inventory --format=json
2026-03-09T21:59:45.234 INFO:tasks.workunit.client.0.vm11.stderr:+ jq '.[]'
2026-03-09T21:59:49.872 INFO:tasks.workunit.client.0.vm11.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout:{
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vdb",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "sys_api": {
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "removable": "0",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "ro": "0",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "0x1af4",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "model": "",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "sas_address": "",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "sas_device_handle": "",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "support_discard": "512",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "rotational": "1",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "nr_requests": "256",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "partitions": {},
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "device_nodes": [
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "vdb"
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "actuators": null,
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "scheduler_mode": "none",
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": 0,
2026-03-09T21:59:50.475 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": "512",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "size": 21474836480,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "20.00 GB",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vdb",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "devname": "vdb",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "type": "disk",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "parent": "/dev/vdb",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "id_bus": ""
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "ceph_device_lvm": false,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "being_replaced": false,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "lsm_data": {},
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "available": true,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "rejected_reasons": [],
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "device_id": "DWNBRSTVMM11001",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "lvs": []
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout:}
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout:{
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vdc",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sys_api": {
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "removable": "0",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "ro": "0",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "0x1af4",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "model": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sas_address": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sas_device_handle": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "support_discard": "512",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "rotational": "1",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "nr_requests": "256",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "partitions": {},
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "device_nodes": [
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "vdc"
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "actuators": null,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "scheduler_mode": "none",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": 0,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": "512",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "size": 21474836480,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "20.00 GB",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vdc",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "devname": "vdc",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "type": "disk",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "parent": "/dev/vdc",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "id_bus": ""
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "ceph_device_lvm": false,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "being_replaced": false,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "lsm_data": {},
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "available": true,
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "rejected_reasons": [],
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "device_id": "DWNBRSTVMM11002",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "lvs": []
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout:}
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout:{
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vdd",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sys_api": {
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "removable": "0",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "ro": "0",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "0x1af4",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "model": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sas_address": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "sas_device_handle": "",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "support_discard": "512",
2026-03-09T21:59:50.476 INFO:tasks.workunit.client.0.vm11.stdout: "rotational": "1",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "nr_requests": "256",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "partitions": {},
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "device_nodes": [
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "vdd"
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "actuators": null,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "scheduler_mode": "none",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": 0,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": "512",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "size": 21474836480,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "20.00 GB",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vdd",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "devname": "vdd",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "type": "disk",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "parent": "/dev/vdd",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "id_bus": ""
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "ceph_device_lvm": false,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "being_replaced": false,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "lsm_data": {},
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "available": true,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "rejected_reasons": [],
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "device_id": "DWNBRSTVMM11003",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "lvs": []
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout:}
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout:{
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vde",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sys_api": {
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "removable": "0",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "ro": "0",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "0x1af4",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "model": "",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sas_address": "",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sas_device_handle": "",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "support_discard": "512",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "rotational": "1",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "nr_requests": "256",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "partitions": {},
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "device_nodes": [
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "vde"
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "actuators": null,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "scheduler_mode": "none",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": 0,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": "512",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "size": 21474836480,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "20.00 GB",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vde",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "devname": "vde",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "type": "disk",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "parent": "/dev/vde",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "id_bus": ""
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "ceph_device_lvm": false,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "being_replaced": false,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "lsm_data": {},
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "available": true,
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "rejected_reasons": [],
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "device_id": "DWNBRSTVMM11004",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "lvs": []
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout:}
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout:{
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/sr0",
2026-03-09T21:59:50.477 INFO:tasks.workunit.client.0.vm11.stdout: "sys_api": {
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "removable": "1",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "ro": "0",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "QEMU",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "model": "QEMU DVD-ROM",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "2.5+",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "sas_address": "",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "sas_device_handle": "",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "support_discard": "0",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "rotational": "1",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "nr_requests": "2",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "partitions": {},
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "device_nodes": [
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "sr0"
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "actuators": null,
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "scheduler_mode": "mq-deadline",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": 0,
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": "2048",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "size": 374784,
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "366.00 KB",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/sr0",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "devname": "sr0",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "type": "disk",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "parent": "/dev/sr0",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "id_bus": "ata"
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "ceph_device_lvm": false,
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "being_replaced": false,
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "lsm_data": {},
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "available": false,
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "rejected_reasons": [
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "Insufficient space (<5GB)",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "Has a FileSystem"
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "device_id": "QEMU_DVD-ROM_QM00003",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "lvs": []
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout:}
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout:{
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vda",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "sys_api": {
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "removable": "0",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "ro": "0",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "vendor": "0x1af4",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "model": "",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "rev": "",
2026-03-09T21:59:50.478 INFO:tasks.workunit.client.0.vm11.stdout: "sas_address": "",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sas_device_handle": "",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "support_discard": "512",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "rotational": "1",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "nr_requests": "256",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "partitions": {
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "vda15": {
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "start": "10240",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": "217088",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": 512,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "size": 111149056,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "106.00 MB",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "holders": []
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "vda1": {
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "start": "227328",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": "83658719",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": 512,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "size": 42833264128,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "39.89 GB",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "holders": []
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "vda14": {
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "start": "2048",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": "8192",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": 512,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "size": 4194304,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "4.00 MB",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "holders": []
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: }
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "device_nodes": [
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "vda"
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "actuators": null,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "scheduler_mode": "none",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectors": 0,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "sectorsize": "512",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "size": 42949672960,
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "human_readable_size": "40.00 GB",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "path": "/dev/vda",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "devname": "vda",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "type": "disk",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "parent": "/dev/vda",
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: "id_bus": ""
2026-03-09T21:59:50.492 INFO:tasks.workunit.client.0.vm11.stdout: },
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "ceph_device_lvm": false,
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "being_replaced": false,
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "lsm_data": {},
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "available": false,
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "rejected_reasons": [
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "Has GPT headers",
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "Has partitions"
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: ],
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "device_id": "",
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout: "lvs": []
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stdout:}
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' true = false ']'
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a
2026-03-09T21:59:50.493 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a
2026-03-09T21:59:50.596 INFO:tasks.workunit.client.0.vm11.stderr:ERROR: must pass --force to proceed: this command may destroy precious data!
2026-03-09T21:59:50.612 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0
2026-03-09T21:59:50.612 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x
2026-03-09T21:59:51.087 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:59:51.087 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x
2026-03-09T21:59:51.087 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:59:51.087 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef
2026-03-09T21:59:51.170 INFO:tasks.workunit.client.0.vm11.stderr:ERROR: must pass --force to proceed: this command may destroy precious data!
2026-03-09T21:59:51.187 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0
2026-03-09T21:59:51.187 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef --force
2026-03-09T21:59:51.752 INFO:tasks.workunit.client.0.vm11.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --zap-osds
2026-03-09T21:59:51.752 INFO:tasks.workunit.client.0.vm11.stderr:+ set -x
2026-03-09T21:59:51.752 INFO:tasks.workunit.client.0.vm11.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --zap-osds
2026-03-09T21:59:51.752 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --zap-osds
2026-03-09T21:59:51.844 INFO:tasks.workunit.client.0.vm11.stderr:ERROR: must pass --force to proceed: this command may destroy precious data!
2026-03-09T21:59:51.858 INFO:tasks.workunit.client.0.vm11.stderr:+ return 0
2026-03-09T21:59:51.858 INFO:tasks.workunit.client.0.vm11.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --force --zap-osds
2026-03-09T21:59:51.944 INFO:tasks.workunit.client.0.vm11.stdout:Deleting cluster with fsid: 00000000-0000-0000-0000-0000deadbeef
2026-03-09T22:00:11.937 INFO:tasks.workunit.client.0.vm11.stdout:PASS
2026-03-09T22:00:11.938 INFO:tasks.workunit.client.0.vm11.stderr:+ echo PASS
2026-03-09T22:00:11.938 INFO:tasks.workunit.client.0.vm11.stderr:+ cleanup
2026-03-09T22:00:11.938 INFO:tasks.workunit.client.0.vm11.stderr:+ '[' true = false ']'
2026-03-09T22:00:11.938 INFO:tasks.workunit.client.0.vm11.stderr:+ dump_all_logs 00000000-0000-0000-0000-0000deadbeef
2026-03-09T22:00:11.938 INFO:tasks.workunit.client.0.vm11.stderr:+ local fsid=00000000-0000-0000-0000-0000deadbeef
2026-03-09T22:00:11.938 INFO:tasks.workunit.client.0.vm11.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls
2026-03-09T22:00:11.938 INFO:tasks.workunit.client.0.vm11.stderr:++ jq -r '.[] | select(.fsid == "00000000-0000-0000-0000-0000deadbeef").name'
2026-03-09T22:00:13.063 INFO:tasks.workunit.client.0.vm11.stdout:dumping logs for daemons:
2026-03-09T22:00:13.063 INFO:tasks.workunit.client.0.vm11.stderr:+ local names=
2026-03-09T22:00:13.063 INFO:tasks.workunit.client.0.vm11.stderr:+ echo 'dumping logs for daemons: '
2026-03-09T22:00:13.063 INFO:tasks.workunit.client.0.vm11.stderr:+ rm -rf tmp.test_cephadm.sh.Qa6KUG
2026-03-09T22:00:13.064 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-09T22:00:13.064 DEBUG:teuthology.orchestra.run.vm11:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-09T22:00:13.115 INFO:tasks.workunit:Stopping ['cephadm/test_cephadm.sh'] on client.0...
2026-03-09T22:00:13.115 DEBUG:teuthology.orchestra.run.vm11:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-09T22:00:13.550 DEBUG:teuthology.parallel:result is None
2026-03-09T22:00:13.550 DEBUG:teuthology.orchestra.run.vm11:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-09T22:00:13.557 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-09T22:00:13.557 DEBUG:teuthology.orchestra.run.vm11:> rmdir -- /home/ubuntu/cephtest/mnt.0
2026-03-09T22:00:13.602 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0
2026-03-09T22:00:13.602 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-09T22:00:13.610 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-09T22:00:13.610 DEBUG:teuthology.orchestra.run.vm11:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-09T22:00:13.657 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system.
2026-03-09T22:00:13.657 DEBUG:teuthology.orchestra.run.vm11:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done
2026-03-09T22:00:13.732 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:13.939 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:13.940 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:14.061 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:14.061 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-09T22:00:14.061 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-09T22:00:14.061 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:14.074 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:14.074 INFO:teuthology.orchestra.run.vm11.stdout: ceph*
2026-03-09T22:00:14.273 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded.
2026-03-09T22:00:14.273 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 47.1 kB disk space will be freed.
2026-03-09T22:00:14.312 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118605 files and directories currently installed.)
2026-03-09T22:00:14.314 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:15.366 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:15.403 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:15.590 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:15.590 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:15.715 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:15.715 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-09T22:00:15.716 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-09T22:00:15.716 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:15.726 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:15.726 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-cephadm* cephadm*
2026-03-09T22:00:15.888 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded.
2026-03-09T22:00:15.888 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 1775 kB disk space will be freed.
2026-03-09T22:00:15.926 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118603 files and directories currently installed.)
2026-03-09T22:00:15.928 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:15.945 INFO:teuthology.orchestra.run.vm11.stdout:Removing cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:15.976 INFO:teuthology.orchestra.run.vm11.stdout:Looking for files to backup/remove ...
2026-03-09T22:00:15.978 INFO:teuthology.orchestra.run.vm11.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*.
2026-03-09T22:00:15.980 INFO:teuthology.orchestra.run.vm11.stdout:Removing user `cephadm' ...
2026-03-09T22:00:15.980 INFO:teuthology.orchestra.run.vm11.stdout:Warning: group `nogroup' has no more members.
2026-03-09T22:00:15.991 INFO:teuthology.orchestra.run.vm11.stdout:Done.
2026-03-09T22:00:16.014 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-09T22:00:16.124 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118529 files and directories currently installed.)
2026-03-09T22:00:16.126 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:17.149 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:17.183 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:17.371 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:17.372 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:17.525 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:17.526 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-09T22:00:17.527 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-09T22:00:17.527 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:17.544 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:17.545 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mds*
2026-03-09T22:00:17.738 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded.
2026-03-09T22:00:17.738 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 7437 kB disk space will be freed.
2026-03-09T22:00:17.773 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118529 files and directories currently installed.)
2026-03-09T22:00:17.774 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:18.230 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-09T22:00:18.321 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118521 files and directories currently installed.)
2026-03-09T22:00:18.323 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:19.770 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:19.803 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:19.982 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:19.982 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:20.145 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools python3-cheroot
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-psutil python3-pyinotify
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa
2026-03-09T22:00:20.146 INFO:teuthology.orchestra.run.vm11.stdout: python3-simplegeneric python3-simplejson python3-singledispatch
2026-03-09T22:00:20.147 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn python3-sklearn-lib python3-tempita python3-tempora
2026-03-09T22:00:20.147 INFO:teuthology.orchestra.run.vm11.stdout: python3-threadpoolctl python3-waitress python3-webob python3-websocket
2026-03-09T22:00:20.147 INFO:teuthology.orchestra.run.vm11.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-09T22:00:20.147 INFO:teuthology.orchestra.run.vm11.stdout: sg3-utils-udev
2026-03-09T22:00:20.147 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:20.164 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:20.165 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local*
2026-03-09T22:00:20.165 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-k8sevents*
2026-03-09T22:00:20.322 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 4 to remove and 10 not upgraded.
2026-03-09T22:00:20.322 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 165 MB disk space will be freed.
2026-03-09T22:00:20.360 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118521 files and directories currently installed.)
2026-03-09T22:00:20.362 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:20.373 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:20.392 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:20.421 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:20.889 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117937 files and directories currently installed.)
2026-03-09T22:00:20.892 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:22.289 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:22.322 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:22.498 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:22.499 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:22.607 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:22.607 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-09T22:00:22.607 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-09T22:00:22.608 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:22.619 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:22.620 INFO:teuthology.orchestra.run.vm11.stdout: ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw*
2026-03-09T22:00:22.782 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded.
2026-03-09T22:00:22.782 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 472 MB disk space will be freed.
2026-03-09T22:00:22.818 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117937 files and directories currently installed.)
2026-03-09T22:00:22.820 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-volume (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:22.879 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:23.286 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:23.728 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:24.131 INFO:teuthology.orchestra.run.vm11.stdout:Removing radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:24.540 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-test (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:24.572 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:25.000 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-09T22:00:25.037 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-09T22:00:25.109 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117456 files and directories currently installed.)
2026-03-09T22:00:25.111 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:25.692 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:26.080 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:26.513 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:26.972 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:28.340 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:28.373 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:28.567 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:28.568 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-09T22:00:28.679 INFO:teuthology.orchestra.run.vm11.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-09T22:00:28.680 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:28.687 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:28.688 INFO:teuthology.orchestra.run.vm11.stdout: ceph-fuse*
2026-03-09T22:00:28.839 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded.
2026-03-09T22:00:28.839 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 3673 kB disk space will be freed.
2026-03-09T22:00:28.873 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117443 files and directories currently installed.)
2026-03-09T22:00:28.876 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:29.274 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-09T22:00:29.532 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117434 files and directories currently installed.)
2026-03-09T22:00:29.534 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:30.826 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:30.862 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:31.030 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:31.030 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout:Package 'ceph-test' is not installed, so not removed
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-09T22:00:31.139 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:31.156 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-09T22:00:31.156 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:31.189 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:31.346 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:31.347 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout:Package 'ceph-volume' is not installed, so not removed 2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-09T22:00:31.452 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn-lib python3-tempita python3-tempora 
python3-threadpoolctl 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-09T22:00:31.453 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T22:00:31.468 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-09T22:00:31.469 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-09T22:00:31.501 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists... 2026-03-09T22:00:31.687 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 2026-03-09T22:00:31.687 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 
2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout:Package 'radosgw' is not installed, so not removed 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-09T22:00:31.799 INFO:teuthology.orchestra.run.vm11.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-09T22:00:31.800 INFO:teuthology.orchestra.run.vm11.stdout: python3-sklearn-lib python3-tempita python3-tempora 
python3-threadpoolctl 2026-03-09T22:00:31.800 INFO:teuthology.orchestra.run.vm11.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-09T22:00:31.800 INFO:teuthology.orchestra.run.vm11.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-09T22:00:31.800 INFO:teuthology.orchestra.run.vm11.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-09T22:00:31.800 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T22:00:31.815 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-09T22:00:31.815 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-09T22:00:31.846 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists... 2026-03-09T22:00:32.025 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 2026-03-09T22:00:32.026 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 
2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-09T22:00:32.147 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch 
python3-sklearn python3-sklearn-lib python3-tempita 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout: xmlstarlet zip 2026-03-09T22:00:32.148 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T22:00:32.157 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED: 2026-03-09T22:00:32.157 INFO:teuthology.orchestra.run.vm11.stdout: python3-cephfs* python3-rados* python3-rgw* 2026-03-09T22:00:32.313 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 3 to remove and 10 not upgraded. 2026-03-09T22:00:32.313 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 2062 kB disk space will be freed. 2026-03-09T22:00:32.349 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... 117434 files and directories currently installed.) 2026-03-09T22:00:32.351 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-09T22:00:32.363 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T22:00:32.374 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T22:00:33.393 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-09T22:00:33.426 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists... 2026-03-09T22:00:33.612 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 2026-03-09T22:00:33.612 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 2026-03-09T22:00:33.748 INFO:teuthology.orchestra.run.vm11.stdout:Package 'python3-rgw' is not installed, so not removed 2026-03-09T22:00:33.748 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:33.748 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:33.748 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-09T22:00:33.748 INFO:teuthology.orchestra.run.vm11.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections 
python3-jaraco.functools python3-jaraco.text 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout: xmlstarlet zip 2026-03-09T22:00:33.749 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T22:00:33.769 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-09T22:00:33.769 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-09T22:00:33.800 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists... 2026-03-09T22:00:33.984 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 
2026-03-09T22:00:33.984 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout:Package 'python3-cephfs' is not installed, so not removed 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 
2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout: xmlstarlet zip 2026-03-09T22:00:34.135 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T22:00:34.151 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-09T22:00:34.176 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-09T22:00:34.182 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists... 2026-03-09T22:00:34.344 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 2026-03-09T22:00:34.345 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 
2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-09T22:00:34.458 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-09T22:00:34.459 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-09T22:00:34.459 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch 
python3-sklearn python3-sklearn-lib python3-tempita 2026-03-09T22:00:34.459 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-09T22:00:34.459 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-09T22:00:34.459 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-09T22:00:34.459 INFO:teuthology.orchestra.run.vm11.stdout: xmlstarlet zip 2026-03-09T22:00:34.459 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T22:00:34.470 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED: 2026-03-09T22:00:34.470 INFO:teuthology.orchestra.run.vm11.stdout: python3-rbd* 2026-03-09T22:00:34.629 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-09T22:00:34.629 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 1186 kB disk space will be freed. 2026-03-09T22:00:34.664 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... 117410 files and directories currently installed.) 2026-03-09T22:00:34.666 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-09T22:00:35.636 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-09T22:00:35.668 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists... 2026-03-09T22:00:35.839 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 2026-03-09T22:00:35.840 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: 
python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout: xmlstarlet zip 2026-03-09T22:00:35.962 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-09T22:00:35.972 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED: 2026-03-09T22:00:35.973 INFO:teuthology.orchestra.run.vm11.stdout: libcephfs-dev* libcephfs2* 2026-03-09T22:00:36.127 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded. 2026-03-09T22:00:36.127 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 3202 kB disk space will be freed. 2026-03-09T22:00:36.162 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... 117402 files and directories currently installed.) 2026-03-09T22:00:36.164 INFO:teuthology.orchestra.run.vm11.stdout:Removing libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T22:00:36.176 INFO:teuthology.orchestra.run.vm11.stdout:Removing libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-09T22:00:36.199 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-09T22:00:37.178 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-09T22:00:37.216 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists... 2026-03-09T22:00:37.397 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree... 2026-03-09T22:00:37.398 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information... 
2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout:Package 'libcephfs-dev' is not installed, so not removed 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required: 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa 
python3-simplegeneric python3-simplejson
2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout: xmlstarlet zip
2026-03-09T22:00:37.523 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:37.543 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-09T22:00:37.543 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:37.575 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:37.758 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:37.759 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-09T22:00:37.884 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:37.891 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:37.892 INFO:teuthology.orchestra.run.vm11.stdout: librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph*
2026-03-09T22:00:37.892 INFO:teuthology.orchestra.run.vm11.stdout: qemu-block-extra* rbd-fuse*
2026-03-09T22:00:38.053 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded.
2026-03-09T22:00:38.053 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 51.6 MB disk space will be freed.
2026-03-09T22:00:38.089 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... 117387 files and directories currently installed.)
2026-03-09T22:00:38.091 INFO:teuthology.orchestra.run.vm11.stdout:Removing rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:38.103 INFO:teuthology.orchestra.run.vm11.stdout:Removing libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:38.115 INFO:teuthology.orchestra.run.vm11.stdout:Removing libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:38.125 INFO:teuthology.orchestra.run.vm11.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-09T22:00:38.555 INFO:teuthology.orchestra.run.vm11.stdout:Removing librbd1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:38.566 INFO:teuthology.orchestra.run.vm11.stdout:Removing librgw2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:38.577 INFO:teuthology.orchestra.run.vm11.stdout:Removing librados2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:38.602 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-09T22:00:38.640 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-09T22:00:38.714 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... 117336 files and directories currently installed.)
2026-03-09T22:00:38.716 INFO:teuthology.orchestra.run.vm11.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-09T22:00:40.254 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:40.286 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:40.466 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:40.467 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout:Package 'librbd1' is not installed, so not removed
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-09T22:00:40.587 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:40.604 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-09T22:00:40.604 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:40.635 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:40.811 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:40.812 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:40.928 INFO:teuthology.orchestra.run.vm11.stdout:Package 'rbd-fuse' is not installed, so not removed
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout:The following packages were automatically installed and are no longer required:
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-09T22:00:40.929 INFO:teuthology.orchestra.run.vm11.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-09T22:00:40.946 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-09T22:00:40.946 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:40.947 DEBUG:teuthology.orchestra.run.vm11:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq
2026-03-09T22:00:41.003 DEBUG:teuthology.orchestra.run.vm11:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove
2026-03-09T22:00:41.078 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:41.261 INFO:teuthology.orchestra.run.vm11.stdout:Building dependency tree...
2026-03-09T22:00:41.262 INFO:teuthology.orchestra.run.vm11.stdout:Reading state information...
2026-03-09T22:00:41.390 INFO:teuthology.orchestra.run.vm11.stdout:The following packages will be REMOVED:
2026-03-09T22:00:41.390 INFO:teuthology.orchestra.run.vm11.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-09T22:00:41.390 INFO:teuthology.orchestra.run.vm11.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-09T22:00:41.390 INFO:teuthology.orchestra.run.vm11.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-09T22:00:41.390 INFO:teuthology.orchestra.run.vm11.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-09T22:00:41.390 INFO:teuthology.orchestra.run.vm11.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-09T22:00:41.391 INFO:teuthology.orchestra.run.vm11.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-09T22:00:41.548 INFO:teuthology.orchestra.run.vm11.stdout:0 upgraded, 0 newly installed, 87 to remove and 10 not upgraded.
2026-03-09T22:00:41.549 INFO:teuthology.orchestra.run.vm11.stdout:After this operation, 107 MB disk space will be freed.
2026-03-09T22:00:41.582 INFO:teuthology.orchestra.run.vm11.stdout:(Reading database ... 117336 files and directories currently installed.)
2026-03-09T22:00:41.584 INFO:teuthology.orchestra.run.vm11.stdout:Removing ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:41.600 INFO:teuthology.orchestra.run.vm11.stdout:Removing jq (1.6-2.1ubuntu3.1) ...
2026-03-09T22:00:41.613 INFO:teuthology.orchestra.run.vm11.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ...
2026-03-09T22:00:41.625 INFO:teuthology.orchestra.run.vm11.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-09T22:00:41.638 INFO:teuthology.orchestra.run.vm11.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-09T22:00:41.650 INFO:teuthology.orchestra.run.vm11.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-09T22:00:41.663 INFO:teuthology.orchestra.run.vm11.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T22:00:41.675 INFO:teuthology.orchestra.run.vm11.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T22:00:41.687 INFO:teuthology.orchestra.run.vm11.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-09T22:00:41.705 INFO:teuthology.orchestra.run.vm11.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-09T22:00:41.716 INFO:teuthology.orchestra.run.vm11.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-09T22:00:41.727 INFO:teuthology.orchestra.run.vm11.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ...
2026-03-09T22:00:41.739 INFO:teuthology.orchestra.run.vm11.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ...
2026-03-09T22:00:41.750 INFO:teuthology.orchestra.run.vm11.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ...
2026-03-09T22:00:41.761 INFO:teuthology.orchestra.run.vm11.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ...
2026-03-09T22:00:41.772 INFO:teuthology.orchestra.run.vm11.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ...
2026-03-09T22:00:41.783 INFO:teuthology.orchestra.run.vm11.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-09T22:00:41.795 INFO:teuthology.orchestra.run.vm11.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-09T22:00:41.807 INFO:teuthology.orchestra.run.vm11.stdout:Removing luarocks (3.8.0+dfsg1-1) ...
2026-03-09T22:00:41.834 INFO:teuthology.orchestra.run.vm11.stdout:Removing liblua5.3-dev:amd64 (5.3.6-1build1) ...
2026-03-09T22:00:41.846 INFO:teuthology.orchestra.run.vm11.stdout:Removing libnbd0 (1.10.5-1) ...
2026-03-09T22:00:41.857 INFO:teuthology.orchestra.run.vm11.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-09T22:00:41.868 INFO:teuthology.orchestra.run.vm11.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-09T22:00:41.880 INFO:teuthology.orchestra.run.vm11.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-09T22:00:41.891 INFO:teuthology.orchestra.run.vm11.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ...
2026-03-09T22:00:41.902 INFO:teuthology.orchestra.run.vm11.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-09T22:00:41.913 INFO:teuthology.orchestra.run.vm11.stdout:Removing libreadline-dev:amd64 (8.1.2-1) ...
2026-03-09T22:00:41.924 INFO:teuthology.orchestra.run.vm11.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ...
2026-03-09T22:00:41.932 INFO:teuthology.orchestra.run.vm11.stdout:update-initramfs: deferring update (trigger activated)
2026-03-09T22:00:41.942 INFO:teuthology.orchestra.run.vm11.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ...
2026-03-09T22:00:41.959 INFO:teuthology.orchestra.run.vm11.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ...
2026-03-09T22:00:41.971 INFO:teuthology.orchestra.run.vm11.stdout:Removing lua-any (27ubuntu1) ...
2026-03-09T22:00:41.982 INFO:teuthology.orchestra.run.vm11.stdout:Removing lua-sec:amd64 (1.0.2-1) ...
2026-03-09T22:00:41.994 INFO:teuthology.orchestra.run.vm11.stdout:Removing lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ...
2026-03-09T22:00:42.007 INFO:teuthology.orchestra.run.vm11.stdout:Removing lua5.1 (5.1.5-8.1build4) ...
2026-03-09T22:00:42.026 INFO:teuthology.orchestra.run.vm11.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ...
2026-03-09T22:00:42.423 INFO:teuthology.orchestra.run.vm11.stdout:Removing pkg-config (0.29.2-1ubuntu3) ...
2026-03-09T22:00:42.454 INFO:teuthology.orchestra.run.vm11.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-09T22:00:42.478 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-pecan (1.3.3-4ubuntu2) ...
2026-03-09T22:00:42.533 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-webtest (2.0.35-1) ...
2026-03-09T22:00:42.580 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-pastescript (2.0.2-4) ...
2026-03-09T22:00:42.638 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-pastedeploy (2.1.1-1) ...
2026-03-09T22:00:42.687 INFO:teuthology.orchestra.run.vm11.stdout:Removing python-pastedeploy-tpl (2.1.1-1) ...
2026-03-09T22:00:42.698 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-09T22:00:42.752 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-09T22:00:43.005 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-google-auth (1.5.1-3) ...
2026-03-09T22:00:43.055 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-cachetools (5.0.0-1) ...
2026-03-09T22:00:43.100 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:43.146 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-09T22:00:43.195 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-cherrypy3 (18.6.1-4) ...
2026-03-09T22:00:43.252 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-09T22:00:43.302 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-jaraco.collections (3.4.0-2) ...
2026-03-09T22:00:43.348 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-jaraco.classes (3.2.1-3) ...
2026-03-09T22:00:43.395 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-portend (3.0.0-1) ...
2026-03-09T22:00:43.440 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-tempora (4.1.2-1) ...
2026-03-09T22:00:43.486 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-jaraco.text (3.6.0-2) ...
2026-03-09T22:00:43.530 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-jaraco.functools (3.4.0-2) ...
2026-03-09T22:00:43.576 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-09T22:00:43.690 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ...
2026-03-09T22:00:43.748 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-logutils (0.3.3-8) ...
2026-03-09T22:00:43.793 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-mako (1.1.3+ds1-2ubuntu0.1) ...
2026-03-09T22:00:43.841 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-natsort (8.0.2-1) ...
2026-03-09T22:00:43.888 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-paste (3.5.0+dfsg1-1) ...
2026-03-09T22:00:43.944 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-prettytable (2.5.0-2) ...
2026-03-09T22:00:43.992 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-psutil (5.9.0-1build1) ...
2026-03-09T22:00:44.046 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-pyinotify (0.9.6-1.3) ...
2026-03-09T22:00:44.093 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-routes (2.5.1-1ubuntu1) ...
2026-03-09T22:00:44.139 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-09T22:00:44.190 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-09T22:00:44.240 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-rsa (4.8-1) ...
2026-03-09T22:00:44.290 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-simplegeneric (0.8.1-3) ...
2026-03-09T22:00:44.334 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-09T22:00:44.385 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-singledispatch (3.4.0.3-3) ...
2026-03-09T22:00:44.435 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-09T22:00:44.564 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-tempita (0.5.2-6ubuntu1) ...
2026-03-09T22:00:44.650 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-09T22:00:44.693 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-waitress (1.4.4-1.1ubuntu1.1) ...
2026-03-09T22:00:44.738 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-09T22:00:44.784 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-09T22:00:44.829 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-09T22:00:44.876 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ...
2026-03-09T22:00:44.925 INFO:teuthology.orchestra.run.vm11.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-09T22:00:44.971 INFO:teuthology.orchestra.run.vm11.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-09T22:00:44.993 INFO:teuthology.orchestra.run.vm11.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-09T22:00:45.388 INFO:teuthology.orchestra.run.vm11.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-09T22:00:45.400 INFO:teuthology.orchestra.run.vm11.stdout:Removing unzip (6.0-26ubuntu3.2) ...
2026-03-09T22:00:45.419 INFO:teuthology.orchestra.run.vm11.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-09T22:00:45.436 INFO:teuthology.orchestra.run.vm11.stdout:Removing zip (3.0-12build2) ...
2026-03-09T22:00:45.462 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-09T22:00:45.472 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-09T22:00:45.516 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ...
2026-03-09T22:00:45.524 INFO:teuthology.orchestra.run.vm11.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-09T22:00:45.542 INFO:teuthology.orchestra.run.vm11.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-1092-kvm
2026-03-09T22:00:46.992 INFO:teuthology.orchestra.run.vm11.stdout:W: mkconf: MD subsystem is not loaded, thus I cannot scan for arrays.
2026-03-09T22:00:46.994 INFO:teuthology.orchestra.run.vm11.stdout:W: mdadm: failed to auto-generate temporary mdadm.conf file.
2026-03-09T22:00:48.875 INFO:teuthology.orchestra.run.vm11.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-09T22:00:48.877 DEBUG:teuthology.parallel:result is None
2026-03-09T22:00:48.877 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm11.local
2026-03-09T22:00:48.878 DEBUG:teuthology.orchestra.run.vm11:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-09T22:00:48.926 DEBUG:teuthology.orchestra.run.vm11:> sudo apt-get update
2026-03-09T22:00:49.101 INFO:teuthology.orchestra.run.vm11.stdout:Hit:1 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-09T22:00:49.191 INFO:teuthology.orchestra.run.vm11.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-09T22:00:49.199 INFO:teuthology.orchestra.run.vm11.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-09T22:00:49.207 INFO:teuthology.orchestra.run.vm11.stdout:Hit:4 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-09T22:00:50.076 INFO:teuthology.orchestra.run.vm11.stdout:Reading package lists...
2026-03-09T22:00:50.089 DEBUG:teuthology.parallel:result is None
2026-03-09T22:00:50.089 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-09T22:00:50.091 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-09T22:00:50.092 DEBUG:teuthology.orchestra.run.vm11:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout: remote refid st t when poll reach delay offset jitter
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:==============================================================================
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:+static.187.15.9 194.59.205.229 3 u 21 64 177 25.195 +0.502 0.229
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:*79.133.44.139 .MBGh. 1 u 17 64 177 20.482 +0.098 0.476
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:+ernie.gerger-ne 213.172.96.14 3 u 14 64 177 31.854 +0.063 0.331
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:+static.222.16.4 35.73.197.144 2 u 20 64 177 0.419 +0.422 1.293
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:-ec2-18-192-244- 216.239.35.8 2 u 25 64 177 23.341 -0.839 0.409
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:-ntp3.uni-ulm.de 129.69.253.1 2 u 15 64 177 27.362 -0.662 0.389
2026-03-09T22:00:51.349 INFO:teuthology.orchestra.run.vm11.stdout:#172-104-149-161 82.35.162.146 2 u 19 64 177 22.469 -5.360 0.228
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:+185.252.140.125 216.239.35.4 2 u 18 64 177 25.099 +0.508 0.359
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:-mail.vbrandl.ne 71.58.123.92 3 u 20 64 177 24.940 -0.519 0.183
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:-v22025082392863 129.69.253.1 2 u 13 64 177 28.223 -1.471 0.560
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:#47.ip-51-75-67. 225.254.30.190 4 u 357 64 140 21.224 +0.955 1.596
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:-timegoesbrrr.ne 131.188.3.222 2 u 16 64 177 28.277 -2.110 0.345
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:#ip217-154-182-6 37.15.221.189 2 u 19 64 177 67.412 -8.919 0.269
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:#alphyn.canonica 132.163.96.1 2 u 31 64 157 102.025 -3.194 1.020
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:#139-144-71-56.i 82.35.162.146 2 u 11 64 177 22.736 -4.518 0.356
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:#217.160.19.219 10.50.0.2 2 u 9 64 177 29.038 -3.533 0.181
2026-03-09T22:00:51.350 INFO:teuthology.orchestra.run.vm11.stdout:#185.125.190.56 79.243.60.50 2 u 29 64 177 36.680 +1.546 1.948
2026-03-09T22:00:51.350 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-09T22:00:51.360 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-09T22:00:51.360 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-09T22:00:51.362 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-09T22:00:51.364 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-09T22:00:51.368 INFO:teuthology.task.internal:Duration was 589.681569 seconds
2026-03-09T22:00:51.368 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-09T22:00:51.371 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-09T22:00:51.371 DEBUG:teuthology.orchestra.run.vm11:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-09T22:00:51.391 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-09T22:00:51.392 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm11.local
2026-03-09T22:00:51.392 DEBUG:teuthology.orchestra.run.vm11:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T22:00:51.444 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-09T22:00:51.444 DEBUG:teuthology.orchestra.run.vm11:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T22:00:51.585 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-09T22:00:51.585 DEBUG:teuthology.orchestra.run.vm11:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T22:00:51.592 INFO:teuthology.orchestra.run.vm11.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T22:00:51.592 INFO:teuthology.orchestra.run.vm11.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T22:00:51.592 INFO:teuthology.orchestra.run.vm11.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T22:00:51.592 INFO:teuthology.orchestra.run.vm11.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T22:00:51.593 INFO:teuthology.orchestra.run.vm11.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T22:00:51.610 INFO:teuthology.orchestra.run.vm11.stderr: 91.3% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T22:00:51.612 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-09T22:00:51.614 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-09T22:00:51.614 DEBUG:teuthology.orchestra.run.vm11:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T22:00:51.663 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-09T22:00:51.666 DEBUG:teuthology.orchestra.run.vm11:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T22:00:51.714 INFO:teuthology.orchestra.run.vm11.stdout:kernel.core_pattern = core
2026-03-09T22:00:51.721 DEBUG:teuthology.orchestra.run.vm11:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T22:00:51.766 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T22:00:51.766 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-09T22:00:51.768 INFO:teuthology.task.internal:Transferring archived files...
2026-03-09T22:00:51.768 DEBUG:teuthology.misc:Transferring archived files from vm11:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/668/remote/vm11
2026-03-09T22:00:51.768 DEBUG:teuthology.orchestra.run.vm11:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T22:00:51.817 INFO:teuthology.task.internal:Removing archive directory...
2026-03-09T22:00:51.817 DEBUG:teuthology.orchestra.run.vm11:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T22:00:51.863 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-09T22:00:51.865 INFO:teuthology.task.internal:Not uploading archives.
2026-03-09T22:00:51.865 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-09T22:00:51.867 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-09T22:00:51.867 DEBUG:teuthology.orchestra.run.vm11:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T22:00:51.906 INFO:teuthology.orchestra.run.vm11.stdout: 258077 4 drwxr-xr-x 2 ubuntu ubuntu 4096 Mar 9 22:00 /home/ubuntu/cephtest
2026-03-09T22:00:51.907 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-09T22:00:51.912 INFO:teuthology.run:Summary data:
description: orch/cephadm/workunits/{0-distro/ubuntu_22.04 agent/off mon_election/classic task/test_cephadm}
duration: 589.6815686225891
flavor: default
owner: kyr
success: true

2026-03-09T22:00:51.912 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T22:00:51.930 INFO:teuthology.run:pass