2026-03-08T22:36:55.940 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-08T22:36:55.943 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:36:55.968 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/277
branch: squid
description: rados:standalone/{supported-random-distro$/{ubuntu_latest} workloads/mgr}
email: null
first_in_suite: false
flavor: default
job_id: '277'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 5909
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
suite: rados:standalone
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm01.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOqCT/tprXdUEaFGI9qk1vi8PLBOU5b/HRelhW4rLbPvkM9cnaUJnreT3jyqZni7iT4yUWP9KsCL8fL2SP0CLKY=
tasks:
- install: null
- workunit:
    basedir: qa/standalone
    clients:
      all:
      - mgr
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-08_21:49:43
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-08T22:36:55.968 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-08T22:36:55.968 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-08T22:36:55.968 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-08T22:36:55.969 INFO:teuthology.task.internal:Checking packages...
2026-03-08T22:36:55.969 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-08T22:36:55.969 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-08T22:36:55.969 INFO:teuthology.packaging:ref: None
2026-03-08T22:36:55.969 INFO:teuthology.packaging:tag: None
2026-03-08T22:36:55.969 INFO:teuthology.packaging:branch: squid
2026-03-08T22:36:55.969 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:36:55.969 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=squid
2026-03-08T22:36:56.623 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:36:56.624 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-08T22:36:56.625 INFO:teuthology.task.internal:no buildpackages task found
2026-03-08T22:36:56.625 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-08T22:36:56.625 INFO:teuthology.task.internal:Saving configuration
2026-03-08T22:36:56.629 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-08T22:36:56.630 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-08T22:36:56.636 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm01.local', 'description': '/archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/277', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 22:36:13.151192', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:01', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOqCT/tprXdUEaFGI9qk1vi8PLBOU5b/HRelhW4rLbPvkM9cnaUJnreT3jyqZni7iT4yUWP9KsCL8fL2SP0CLKY='}
2026-03-08T22:36:56.636 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-08T22:36:56.637 INFO:teuthology.task.internal:roles: ubuntu@vm01.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-08T22:36:56.637 INFO:teuthology.run_tasks:Running task console_log...
2026-03-08T22:36:56.644 DEBUG:teuthology.task.console_log:vm01 does not support IPMI; excluding
2026-03-08T22:36:56.644 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7ff85580c3a0>, signals=[15])
2026-03-08T22:36:56.644 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-08T22:36:56.645 INFO:teuthology.task.internal:Opening connections...
2026-03-08T22:36:56.645 DEBUG:teuthology.task.internal:connecting to ubuntu@vm01.local
2026-03-08T22:36:56.646 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm01.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:36:56.706 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-08T22:36:56.707 DEBUG:teuthology.orchestra.run.vm01:> uname -m
2026-03-08T22:36:56.825 INFO:teuthology.orchestra.run.vm01.stdout:x86_64
2026-03-08T22:36:56.825 DEBUG:teuthology.orchestra.run.vm01:> cat /etc/os-release
2026-03-08T22:36:56.868 INFO:teuthology.orchestra.run.vm01.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-08T22:36:56.868 INFO:teuthology.orchestra.run.vm01.stdout:NAME="Ubuntu"
2026-03-08T22:36:56.868 INFO:teuthology.orchestra.run.vm01.stdout:VERSION_ID="22.04"
2026-03-08T22:36:56.868 INFO:teuthology.orchestra.run.vm01.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-08T22:36:56.868 INFO:teuthology.orchestra.run.vm01.stdout:VERSION_CODENAME=jammy
2026-03-08T22:36:56.868 INFO:teuthology.orchestra.run.vm01.stdout:ID=ubuntu
2026-03-08T22:36:56.869 INFO:teuthology.orchestra.run.vm01.stdout:ID_LIKE=debian
2026-03-08T22:36:56.869 INFO:teuthology.orchestra.run.vm01.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-08T22:36:56.869 INFO:teuthology.orchestra.run.vm01.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-08T22:36:56.869 INFO:teuthology.orchestra.run.vm01.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-08T22:36:56.869 INFO:teuthology.orchestra.run.vm01.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-08T22:36:56.869 INFO:teuthology.orchestra.run.vm01.stdout:UBUNTU_CODENAME=jammy
2026-03-08T22:36:56.869 INFO:teuthology.lock.ops:Updating vm01.local on lock server
2026-03-08T22:36:56.874 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-08T22:36:56.876 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-08T22:36:56.877 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-08T22:36:56.877 DEBUG:teuthology.orchestra.run.vm01:> test '!' -e /home/ubuntu/cephtest
2026-03-08T22:36:56.912 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-08T22:36:56.913 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-08T22:36:56.913 DEBUG:teuthology.orchestra.run.vm01:> test -z $(ls -A /var/lib/ceph)
2026-03-08T22:36:56.956 INFO:teuthology.orchestra.run.vm01.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T22:36:56.957 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-08T22:36:56.965 DEBUG:teuthology.orchestra.run.vm01:> test -e /ceph-qa-ready
2026-03-08T22:36:57.000 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:36:57.525 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-08T22:36:57.526 INFO:teuthology.task.internal:Creating test directory...
2026-03-08T22:36:57.527 DEBUG:teuthology.orchestra.run.vm01:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T22:36:57.529 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-08T22:36:57.531 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-08T22:36:57.532 INFO:teuthology.task.internal:Creating archive directory...
2026-03-08T22:36:57.532 DEBUG:teuthology.orchestra.run.vm01:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T22:36:57.577 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-08T22:36:57.578 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-08T22:36:57.578 DEBUG:teuthology.orchestra.run.vm01:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T22:36:57.620 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:36:57.620 DEBUG:teuthology.orchestra.run.vm01:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T22:36:57.669 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:36:57.674 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:36:57.675 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-08T22:36:57.676 INFO:teuthology.task.internal:Configuring sudo...
2026-03-08T22:36:57.676 DEBUG:teuthology.orchestra.run.vm01:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T22:36:57.724 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-08T22:36:57.726 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-08T22:36:57.726 DEBUG:teuthology.orchestra.run.vm01:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T22:36:57.768 DEBUG:teuthology.orchestra.run.vm01:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:36:57.812 DEBUG:teuthology.orchestra.run.vm01:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:36:57.856 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-08T22:36:57.856 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T22:36:57.904 DEBUG:teuthology.orchestra.run.vm01:> sudo service rsyslog restart
2026-03-08T22:36:57.961 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-08T22:36:57.963 INFO:teuthology.task.internal:Starting timer...
2026-03-08T22:36:57.963 INFO:teuthology.run_tasks:Running task pcp...
2026-03-08T22:36:57.966 INFO:teuthology.run_tasks:Running task selinux...
2026-03-08T22:36:57.968 INFO:teuthology.task.selinux:Excluding vm01: VMs are not yet supported
2026-03-08T22:36:57.968 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-08T22:36:57.968 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-08T22:36:57.968 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-08T22:36:57.968 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-08T22:36:57.969 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-08T22:36:57.970 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-08T22:36:57.971 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-08T22:36:58.539 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-08T22:36:58.544 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-08T22:36:58.545 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventorykb7qxlie --limit vm01.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-08T22:38:57.140 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm01.local')]
2026-03-08T22:38:57.140 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm01.local'
2026-03-08T22:38:57.141 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm01.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:38:57.207 DEBUG:teuthology.orchestra.run.vm01:> true
2026-03-08T22:38:57.413 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm01.local'
2026-03-08T22:38:57.413 INFO:teuthology.run_tasks:Running task clock...
2026-03-08T22:38:57.417 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-08T22:38:57.417 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T22:38:57.417 DEBUG:teuthology.orchestra.run.vm01:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Command line: ntpd -gq
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: ----------------------------------------------------
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: ntp-4 is maintained by Network Time Foundation,
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: corporation. Support and training for ntp-4 are
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: available at https://www.nwtime.org/support
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: ----------------------------------------------------
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: proto: precision = 0.030 usec (-25)
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: basedate set to 2022-02-04
2026-03-08T22:38:57.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: gps base set to 2022-02-06 (week 2196)
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stderr: 8 Mar 22:38:57 ntpd[15972]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 71 days ago
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Listen and drop on 0 v6wildcard [::]:123
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Listen normally on 2 lo 127.0.0.1:123
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Listen normally on 3 ens3 192.168.123.101:123
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Listen normally on 4 lo [::1]:123
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:1%2]:123
2026-03-08T22:38:57.469 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:57 ntpd[15972]: Listening on routing socket on fd #22 for interface updates
2026-03-08T22:38:58.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:58 ntpd[15972]: Soliciting pool server 213.239.234.28
2026-03-08T22:38:59.467 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:59 ntpd[15972]: Soliciting pool server 81.3.27.46
2026-03-08T22:38:59.468 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:38:59 ntpd[15972]: Soliciting pool server 194.59.205.229
2026-03-08T22:39:00.467 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:00 ntpd[15972]: Soliciting pool server 134.60.1.30
2026-03-08T22:39:00.467 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:00 ntpd[15972]: Soliciting pool server 158.101.188.125
2026-03-08T22:39:00.467 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:00 ntpd[15972]: Soliciting pool server 116.203.218.109
2026-03-08T22:39:01.466 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:01 ntpd[15972]: Soliciting pool server 46.38.244.94
2026-03-08T22:39:01.466 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:01 ntpd[15972]: Soliciting pool server 78.46.53.2
2026-03-08T22:39:01.466 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:01 ntpd[15972]: Soliciting pool server 94.16.122.152
2026-03-08T22:39:01.467 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:01 ntpd[15972]: Soliciting pool server 195.201.20.16
2026-03-08T22:39:02.466 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:02 ntpd[15972]: Soliciting pool server 185.216.176.59
2026-03-08T22:39:02.466 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:02 ntpd[15972]: Soliciting pool server 152.53.191.142
2026-03-08T22:39:02.466 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:02 ntpd[15972]: Soliciting pool server 139.162.156.95
2026-03-08T22:39:02.466 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:02 ntpd[15972]: Soliciting pool server 91.189.91.157
2026-03-08T22:39:03.465 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:03 ntpd[15972]: Soliciting pool server 185.125.190.56
2026-03-08T22:39:03.465 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:03 ntpd[15972]: Soliciting pool server 212.132.108.186
2026-03-08T22:39:03.465 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:03 ntpd[15972]: Soliciting pool server 193.203.3.170
2026-03-08T22:39:05.488 INFO:teuthology.orchestra.run.vm01.stdout: 8 Mar 22:39:05 ntpd[15972]: ntpd: time slew +0.012922 s
2026-03-08T22:39:05.488 INFO:teuthology.orchestra.run.vm01.stdout:ntpd: time slew +0.012922s
2026-03-08T22:39:05.508 INFO:teuthology.orchestra.run.vm01.stdout: remote refid st t when poll reach delay offset jitter
2026-03-08T22:39:05.508 INFO:teuthology.orchestra.run.vm01.stdout:==============================================================================
2026-03-08T22:39:05.508 INFO:teuthology.orchestra.run.vm01.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:39:05.508 INFO:teuthology.orchestra.run.vm01.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:39:05.508 INFO:teuthology.orchestra.run.vm01.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:39:05.508 INFO:teuthology.orchestra.run.vm01.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:39:05.508 INFO:teuthology.orchestra.run.vm01.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:39:05.509 INFO:teuthology.run_tasks:Running task install...
2026-03-08T22:39:05.511 DEBUG:teuthology.task.install:project ceph
2026-03-08T22:39:05.511 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:39:05.511 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:39:05.511 INFO:teuthology.task.install:Using flavor: default
2026-03-08T22:39:05.514 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-08T22:39:05.514 INFO:teuthology.task.install:extra packages: []
2026-03-08T22:39:05.515 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-key list | grep Ceph
2026-03-08T22:39:05.583 INFO:teuthology.orchestra.run.vm01.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-08T22:39:05.600 INFO:teuthology.orchestra.run.vm01.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-08T22:39:05.600 INFO:teuthology.orchestra.run.vm01.stdout:uid [ unknown] Ceph.com (release key)
2026-03-08T22:39:05.601 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-08T22:39:05.601 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-xmltodict, python3-jmespath on remote deb x86_64
2026-03-08T22:39:05.601 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:39:06.244 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default/
2026-03-08T22:39:06.244 INFO:teuthology.task.install.deb:Package version is 19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:39:06.777 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-08T22:39:06.777 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-08T22:39:06.786 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-get update
2026-03-08T22:39:07.054 INFO:teuthology.orchestra.run.vm01.stdout:Hit:1 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-08T22:39:07.339 INFO:teuthology.orchestra.run.vm01.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-08T22:39:07.443 INFO:teuthology.orchestra.run.vm01.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-08T22:39:07.487 INFO:teuthology.orchestra.run.vm01.stdout:Ign:4 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy InRelease
2026-03-08T22:39:07.547 INFO:teuthology.orchestra.run.vm01.stdout:Hit:5 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-08T22:39:07.600 INFO:teuthology.orchestra.run.vm01.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release [7662 B]
2026-03-08T22:39:07.713 INFO:teuthology.orchestra.run.vm01.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-08T22:39:07.826 INFO:teuthology.orchestra.run.vm01.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.1 kB]
2026-03-08T22:39:07.907 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 25.8 kB in 1s (26.6 kB/s)
2026-03-08T22:39:08.622 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:39:08.635 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=19.2.3-678-ge911bdeb-1jammy cephadm=19.2.3-678-ge911bdeb-1jammy ceph-mds=19.2.3-678-ge911bdeb-1jammy ceph-mgr=19.2.3-678-ge911bdeb-1jammy ceph-common=19.2.3-678-ge911bdeb-1jammy ceph-fuse=19.2.3-678-ge911bdeb-1jammy ceph-test=19.2.3-678-ge911bdeb-1jammy ceph-volume=19.2.3-678-ge911bdeb-1jammy radosgw=19.2.3-678-ge911bdeb-1jammy python3-rados=19.2.3-678-ge911bdeb-1jammy python3-rgw=19.2.3-678-ge911bdeb-1jammy python3-cephfs=19.2.3-678-ge911bdeb-1jammy python3-rbd=19.2.3-678-ge911bdeb-1jammy libcephfs2=19.2.3-678-ge911bdeb-1jammy libcephfs-dev=19.2.3-678-ge911bdeb-1jammy librados2=19.2.3-678-ge911bdeb-1jammy librbd1=19.2.3-678-ge911bdeb-1jammy rbd-fuse=19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:39:08.671 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:39:08.858 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:39:08.858 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:39:09.019 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:39:09.020 INFO:teuthology.orchestra.run.vm01.stdout:  kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-08T22:39:09.020 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-08T22:39:09.020 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:39:09.020 INFO:teuthology.orchestra.run.vm01.stdout:The following additional packages will be installed:
2026-03-08T22:39:09.020 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-08T22:39:09.020 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  liboath0 libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  libradosstriper1 librdkafka1 libreadline-dev librgw2 libsqlite3-mod-ceph
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  libthrift-0.16.0 lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:39:09.021 INFO:teuthology.orchestra.run.vm01.stdout:  python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-pyinotify python3-pytest python3-repoze.lru
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-toml python3-waitress python3-wcwidth python3-webob
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-websocket python3-webtest python3-werkzeug python3-zc.lockfile
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  qttranslations5-l10n smartmontools socat unzip xmlstarlet zip
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:Suggested packages:
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python3-influxdb readline-doc python3-beaker python-mako-doc
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python-natsort-doc httpd-wsgi libapache2-mod-python libapache2-mod-scgi
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  libjs-mochikit python-pecan-doc python-psutil-doc subversion
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python-pygments-doc ttf-bitstream-vera python-pyinotify-doc python3-dap
2026-03-08T22:39:09.022 INFO:teuthology.orchestra.run.vm01.stdout:  python-sklearn-doc ipython3 python-waitress-doc python-webob-doc
2026-03-08T22:39:09.023 INFO:teuthology.orchestra.run.vm01.stdout:  python-webtest-doc python-werkzeug-doc python3-watchdog gsmartcontrol
2026-03-08T22:39:09.023 INFO:teuthology.orchestra.run.vm01.stdout:  smart-notifier mailx | mailutils
2026-03-08T22:39:09.023 INFO:teuthology.orchestra.run.vm01.stdout:Recommended packages:
2026-03-08T22:39:09.023 INFO:teuthology.orchestra.run.vm01.stdout:  btrfs-tools
2026-03-08T22:39:09.064 INFO:teuthology.orchestra.run.vm01.stdout:The following NEW packages will be installed:
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  libcephfs-dev libcephfs2 libdouble-conversion3 libfuse2 libjq1 liblttng-ust1
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  liblua5.3-dev libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 libreadline-dev
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  librgw2 libsqlite3-mod-ceph libthrift-0.16.0 lua-any lua-sec lua-socket
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  lua5.1 luarocks nvme-cli pkg-config python-asyncssh-doc
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:39:09.065 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.text python3-joblib python3-kubernetes
python3-logutils 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-pluggy python3-portend 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-prettytable python3-psutil python3-py python3-pygments 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-pytest python3-rados python3-rbd 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-repoze.lru python3-requests-oauthlib python3-rgw python3-routes 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-rsa python3-simplegeneric python3-simplejson python3-singledispatch 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn python3-sklearn-lib python3-tempita python3-tempora 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-threadpoolctl python3-toml python3-waitress python3-wcwidth 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n radosgw rbd-fuse smartmontools 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout: socat unzip xmlstarlet zip 2026-03-08T22:39:09.066 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be upgraded: 2026-03-08T22:39:09.067 INFO:teuthology.orchestra.run.vm01.stdout: librados2 librbd1 2026-03-08T22:39:09.163 INFO:teuthology.orchestra.run.vm01.stdout:2 upgraded, 107 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:39:09.163 INFO:teuthology.orchestra.run.vm01.stdout:Need to get 178 MB of archives. 
2026-03-08T22:39:09.163 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 782 MB of additional disk space will be used. 2026-03-08T22:39:09.163 INFO:teuthology.orchestra.run.vm01.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB] 2026-03-08T22:39:09.200 INFO:teuthology.orchestra.run.vm01.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB] 2026-03-08T22:39:09.200 INFO:teuthology.orchestra.run.vm01.stdout:Get:3 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB] 2026-03-08T22:39:09.207 INFO:teuthology.orchestra.run.vm01.stdout:Get:4 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB] 2026-03-08T22:39:09.232 INFO:teuthology.orchestra.run.vm01.stdout:Get:5 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB] 2026-03-08T22:39:09.233 INFO:teuthology.orchestra.run.vm01.stdout:Get:6 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB] 2026-03-08T22:39:09.247 INFO:teuthology.orchestra.run.vm01.stdout:Get:7 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB] 2026-03-08T22:39:09.248 INFO:teuthology.orchestra.run.vm01.stdout:Get:8 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB] 2026-03-08T22:39:09.249 INFO:teuthology.orchestra.run.vm01.stdout:Get:9 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB] 2026-03-08T22:39:09.249 INFO:teuthology.orchestra.run.vm01.stdout:Get:10 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB] 2026-03-08T22:39:09.249 INFO:teuthology.orchestra.run.vm01.stdout:Get:11 https://archive.ubuntu.com/ubuntu jammy/universe 
amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB] 2026-03-08T22:39:09.252 INFO:teuthology.orchestra.run.vm01.stdout:Get:12 https://archive.ubuntu.com/ubuntu jammy/main amd64 libreadline-dev amd64 8.1.2-1 [166 kB] 2026-03-08T22:39:09.253 INFO:teuthology.orchestra.run.vm01.stdout:Get:13 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblua5.3-dev amd64 5.3.6-1build1 [167 kB] 2026-03-08T22:39:09.254 INFO:teuthology.orchestra.run.vm01.stdout:Get:14 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua5.1 amd64 5.1.5-8.1build4 [94.6 kB] 2026-03-08T22:39:09.255 INFO:teuthology.orchestra.run.vm01.stdout:Get:15 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-any all 27ubuntu1 [5034 B] 2026-03-08T22:39:09.259 INFO:teuthology.orchestra.run.vm01.stdout:Get:16 https://archive.ubuntu.com/ubuntu jammy/main amd64 zip amd64 3.0-12build2 [176 kB] 2026-03-08T22:39:09.261 INFO:teuthology.orchestra.run.vm01.stdout:Get:17 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 unzip amd64 6.0-26ubuntu3.2 [175 kB] 2026-03-08T22:39:09.262 INFO:teuthology.orchestra.run.vm01.stdout:Get:18 https://archive.ubuntu.com/ubuntu jammy/universe amd64 luarocks all 3.8.0+dfsg1-1 [140 kB] 2026-03-08T22:39:09.263 INFO:teuthology.orchestra.run.vm01.stdout:Get:19 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB] 2026-03-08T22:39:09.264 INFO:teuthology.orchestra.run.vm01.stdout:Get:20 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B] 2026-03-08T22:39:09.266 INFO:teuthology.orchestra.run.vm01.stdout:Get:21 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB] 2026-03-08T22:39:09.267 INFO:teuthology.orchestra.run.vm01.stdout:Get:22 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B] 2026-03-08T22:39:09.267 INFO:teuthology.orchestra.run.vm01.stdout:Get:23 https://archive.ubuntu.com/ubuntu 
jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B] 2026-03-08T22:39:09.268 INFO:teuthology.orchestra.run.vm01.stdout:Get:24 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB] 2026-03-08T22:39:09.268 INFO:teuthology.orchestra.run.vm01.stdout:Get:25 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB] 2026-03-08T22:39:09.274 INFO:teuthology.orchestra.run.vm01.stdout:Get:26 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B] 2026-03-08T22:39:09.274 INFO:teuthology.orchestra.run.vm01.stdout:Get:27 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B] 2026-03-08T22:39:09.274 INFO:teuthology.orchestra.run.vm01.stdout:Get:28 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB] 2026-03-08T22:39:09.276 INFO:teuthology.orchestra.run.vm01.stdout:Get:29 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB] 2026-03-08T22:39:09.276 INFO:teuthology.orchestra.run.vm01.stdout:Get:30 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-logutils all 0.3.3-8 [17.6 kB] 2026-03-08T22:39:09.281 INFO:teuthology.orchestra.run.vm01.stdout:Get:31 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-mako all 1.1.3+ds1-2ubuntu0.1 [60.5 kB] 2026-03-08T22:39:09.282 INFO:teuthology.orchestra.run.vm01.stdout:Get:32 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplegeneric all 0.8.1-3 [11.3 kB] 2026-03-08T22:39:09.282 INFO:teuthology.orchestra.run.vm01.stdout:Get:33 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-singledispatch all 3.4.0.3-3 [7320 B] 2026-03-08T22:39:09.282 INFO:teuthology.orchestra.run.vm01.stdout:Get:34 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB] 2026-03-08T22:39:09.283 INFO:teuthology.orchestra.run.vm01.stdout:Get:35 
https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-waitress all 1.4.4-1.1ubuntu1.1 [47.0 kB] 2026-03-08T22:39:09.288 INFO:teuthology.orchestra.run.vm01.stdout:Get:36 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempita all 0.5.2-6ubuntu1 [15.1 kB] 2026-03-08T22:39:09.289 INFO:teuthology.orchestra.run.vm01.stdout:Get:37 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-paste all 3.5.0+dfsg1-1 [456 kB] 2026-03-08T22:39:09.292 INFO:teuthology.orchestra.run.vm01.stdout:Get:38 https://archive.ubuntu.com/ubuntu jammy/main amd64 python-pastedeploy-tpl all 2.1.1-1 [4892 B] 2026-03-08T22:39:09.293 INFO:teuthology.orchestra.run.vm01.stdout:Get:39 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastedeploy all 2.1.1-1 [26.6 kB] 2026-03-08T22:39:09.293 INFO:teuthology.orchestra.run.vm01.stdout:Get:40 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-webtest all 2.0.35-1 [28.5 kB] 2026-03-08T22:39:09.296 INFO:teuthology.orchestra.run.vm01.stdout:Get:41 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pecan all 1.3.3-4ubuntu2 [87.3 kB] 2026-03-08T22:39:09.297 INFO:teuthology.orchestra.run.vm01.stdout:Get:42 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-werkzeug all 2.0.2+dfsg1-1ubuntu0.22.04.3 [181 kB] 2026-03-08T22:39:09.298 INFO:teuthology.orchestra.run.vm01.stdout:Get:43 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB] 2026-03-08T22:39:09.299 INFO:teuthology.orchestra.run.vm01.stdout:Get:44 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB] 2026-03-08T22:39:09.301 INFO:teuthology.orchestra.run.vm01.stdout:Get:45 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB] 2026-03-08T22:39:09.303 INFO:teuthology.orchestra.run.vm01.stdout:Get:46 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB] 
2026-03-08T22:39:09.304 INFO:teuthology.orchestra.run.vm01.stdout:Get:47 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB] 2026-03-08T22:39:09.332 INFO:teuthology.orchestra.run.vm01.stdout:Get:48 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB] 2026-03-08T22:39:09.333 INFO:teuthology.orchestra.run.vm01.stdout:Get:49 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB] 2026-03-08T22:39:09.333 INFO:teuthology.orchestra.run.vm01.stdout:Get:50 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB] 2026-03-08T22:39:09.347 INFO:teuthology.orchestra.run.vm01.stdout:Get:51 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B] 2026-03-08T22:39:09.348 INFO:teuthology.orchestra.run.vm01.stdout:Get:52 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB] 2026-03-08T22:39:09.348 INFO:teuthology.orchestra.run.vm01.stdout:Get:53 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB] 2026-03-08T22:39:09.349 INFO:teuthology.orchestra.run.vm01.stdout:Get:54 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB] 2026-03-08T22:39:09.349 INFO:teuthology.orchestra.run.vm01.stdout:Get:55 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB] 2026-03-08T22:39:09.350 INFO:teuthology.orchestra.run.vm01.stdout:Get:56 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB] 2026-03-08T22:39:09.352 INFO:teuthology.orchestra.run.vm01.stdout:Get:57 https://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB] 2026-03-08T22:39:09.353 INFO:teuthology.orchestra.run.vm01.stdout:Get:58 https://archive.ubuntu.com/ubuntu 
jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB] 2026-03-08T22:39:09.354 INFO:teuthology.orchestra.run.vm01.stdout:Get:59 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB] 2026-03-08T22:39:09.357 INFO:teuthology.orchestra.run.vm01.stdout:Get:60 https://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB] 2026-03-08T22:39:09.361 INFO:teuthology.orchestra.run.vm01.stdout:Get:61 https://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB] 2026-03-08T22:39:09.363 INFO:teuthology.orchestra.run.vm01.stdout:Get:62 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-socket amd64 3.0~rc1+git+ac3201d-6 [78.9 kB] 2026-03-08T22:39:09.364 INFO:teuthology.orchestra.run.vm01.stdout:Get:63 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-sec amd64 1.0.2-1 [37.6 kB] 2026-03-08T22:39:09.364 INFO:teuthology.orchestra.run.vm01.stdout:Get:64 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB] 2026-03-08T22:39:09.376 INFO:teuthology.orchestra.run.vm01.stdout:Get:65 https://archive.ubuntu.com/ubuntu jammy/main amd64 pkg-config amd64 0.29.2-1ubuntu3 [48.2 kB] 2026-03-08T22:39:09.377 INFO:teuthology.orchestra.run.vm01.stdout:Get:66 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB] 2026-03-08T22:39:09.379 INFO:teuthology.orchestra.run.vm01.stdout:Get:67 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B] 2026-03-08T22:39:09.379 INFO:teuthology.orchestra.run.vm01.stdout:Get:68 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastescript all 2.0.2-4 [54.6 kB] 2026-03-08T22:39:09.380 INFO:teuthology.orchestra.run.vm01.stdout:Get:69 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB] 2026-03-08T22:39:09.380 INFO:teuthology.orchestra.run.vm01.stdout:Get:70 
https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB] 2026-03-08T22:39:09.381 INFO:teuthology.orchestra.run.vm01.stdout:Get:71 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB] 2026-03-08T22:39:09.382 INFO:teuthology.orchestra.run.vm01.stdout:Get:72 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB] 2026-03-08T22:39:09.387 INFO:teuthology.orchestra.run.vm01.stdout:Get:73 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pyinotify all 0.9.6-1.3 [24.8 kB] 2026-03-08T22:39:09.387 INFO:teuthology.orchestra.run.vm01.stdout:Get:74 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB] 2026-03-08T22:39:09.394 INFO:teuthology.orchestra.run.vm01.stdout:Get:75 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB] 2026-03-08T22:39:09.396 INFO:teuthology.orchestra.run.vm01.stdout:Get:76 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB] 2026-03-08T22:39:09.397 INFO:teuthology.orchestra.run.vm01.stdout:Get:77 https://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB] 2026-03-08T22:39:09.414 INFO:teuthology.orchestra.run.vm01.stdout:Get:78 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB] 2026-03-08T22:39:09.663 INFO:teuthology.orchestra.run.vm01.stdout:Get:79 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 19.2.3-678-ge911bdeb-1jammy [3257 kB] 2026-03-08T22:39:10.469 INFO:teuthology.orchestra.run.vm01.stdout:Get:80 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 19.2.3-678-ge911bdeb-1jammy [3597 kB] 2026-03-08T22:39:10.591 
INFO:teuthology.orchestra.run.vm01.stdout:Get:81 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 19.2.3-678-ge911bdeb-1jammy [979 kB] 2026-03-08T22:39:10.604 INFO:teuthology.orchestra.run.vm01.stdout:Get:82 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 19.2.3-678-ge911bdeb-1jammy [357 kB] 2026-03-08T22:39:10.608 INFO:teuthology.orchestra.run.vm01.stdout:Get:83 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 19.2.3-678-ge911bdeb-1jammy [32.9 kB] 2026-03-08T22:39:10.609 INFO:teuthology.orchestra.run.vm01.stdout:Get:84 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 19.2.3-678-ge911bdeb-1jammy [184 kB] 2026-03-08T22:39:10.613 INFO:teuthology.orchestra.run.vm01.stdout:Get:85 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 19.2.3-678-ge911bdeb-1jammy [70.1 kB] 2026-03-08T22:39:10.614 INFO:teuthology.orchestra.run.vm01.stdout:Get:86 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 19.2.3-678-ge911bdeb-1jammy [334 kB] 2026-03-08T22:39:10.619 INFO:teuthology.orchestra.run.vm01.stdout:Get:87 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 19.2.3-678-ge911bdeb-1jammy [6935 kB] 2026-03-08T22:39:10.937 INFO:teuthology.orchestra.run.vm01.stdout:Get:88 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 
19.2.3-678-ge911bdeb-1jammy [112 kB] 2026-03-08T22:39:10.937 INFO:teuthology.orchestra.run.vm01.stdout:Get:89 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 19.2.3-678-ge911bdeb-1jammy [470 kB] 2026-03-08T22:39:10.940 INFO:teuthology.orchestra.run.vm01.stdout:Get:90 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 19.2.3-678-ge911bdeb-1jammy [26.5 MB] 2026-03-08T22:39:12.029 INFO:teuthology.orchestra.run.vm01.stdout:Get:91 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 19.2.3-678-ge911bdeb-1jammy [5178 kB] 2026-03-08T22:39:12.248 INFO:teuthology.orchestra.run.vm01.stdout:Get:92 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 19.2.3-678-ge911bdeb-1jammy [248 kB] 2026-03-08T22:39:12.249 INFO:teuthology.orchestra.run.vm01.stdout:Get:93 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 19.2.3-678-ge911bdeb-1jammy [125 kB] 2026-03-08T22:39:12.251 INFO:teuthology.orchestra.run.vm01.stdout:Get:94 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 19.2.3-678-ge911bdeb-1jammy [1081 kB] 2026-03-08T22:39:12.269 INFO:teuthology.orchestra.run.vm01.stdout:Get:95 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 19.2.3-678-ge911bdeb-1jammy [6239 kB] 2026-03-08T22:39:12.497 INFO:teuthology.orchestra.run.vm01.stdout:Get:96 
https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 19.2.3-678-ge911bdeb-1jammy [23.0 MB] 2026-03-08T22:39:13.369 INFO:teuthology.orchestra.run.vm01.stdout:Get:97 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 19.2.3-678-ge911bdeb-1jammy [14.2 kB] 2026-03-08T22:39:13.370 INFO:teuthology.orchestra.run.vm01.stdout:Get:98 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 19.2.3-678-ge911bdeb-1jammy [1173 kB] 2026-03-08T22:39:13.421 INFO:teuthology.orchestra.run.vm01.stdout:Get:99 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 19.2.3-678-ge911bdeb-1jammy [2503 kB] 2026-03-08T22:39:13.504 INFO:teuthology.orchestra.run.vm01.stdout:Get:100 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 19.2.3-678-ge911bdeb-1jammy [798 kB] 2026-03-08T22:39:13.538 INFO:teuthology.orchestra.run.vm01.stdout:Get:101 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 19.2.3-678-ge911bdeb-1jammy [157 kB] 2026-03-08T22:39:13.539 INFO:teuthology.orchestra.run.vm01.stdout:Get:102 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 19.2.3-678-ge911bdeb-1jammy [2396 kB] 2026-03-08T22:39:13.650 INFO:teuthology.orchestra.run.vm01.stdout:Get:103 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 19.2.3-678-ge911bdeb-1jammy [8625 kB] 2026-03-08T22:39:13.941 
INFO:teuthology.orchestra.run.vm01.stdout:Get:104 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 19.2.3-678-ge911bdeb-1jammy [14.3 kB] 2026-03-08T22:39:13.941 INFO:teuthology.orchestra.run.vm01.stdout:Get:105 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 19.2.3-678-ge911bdeb-1jammy [52.1 MB] 2026-03-08T22:39:15.884 INFO:teuthology.orchestra.run.vm01.stdout:Get:106 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 19.2.3-678-ge911bdeb-1jammy [135 kB] 2026-03-08T22:39:15.884 INFO:teuthology.orchestra.run.vm01.stdout:Get:107 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 19.2.3-678-ge911bdeb-1jammy [41.0 kB] 2026-03-08T22:39:15.884 INFO:teuthology.orchestra.run.vm01.stdout:Get:108 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 19.2.3-678-ge911bdeb-1jammy [13.7 MB] 2026-03-08T22:39:16.385 INFO:teuthology.orchestra.run.vm01.stdout:Get:109 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse amd64 19.2.3-678-ge911bdeb-1jammy [92.2 kB] 2026-03-08T22:39:16.702 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 178 MB in 7s (24.4 MB/s) 2026-03-08T22:39:17.016 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liblttng-ust1:amd64. 2026-03-08T22:39:17.044 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 
35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 111717 files and directories currently installed.) 2026-03-08T22:39:17.046 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../000-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ... 2026-03-08T22:39:17.048 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-08T22:39:17.072 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libdouble-conversion3:amd64. 2026-03-08T22:39:17.077 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../001-libdouble-conversion3_3.1.7-4_amd64.deb ... 2026-03-08T22:39:17.078 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-08T22:39:17.117 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libpcre2-16-0:amd64. 2026-03-08T22:39:17.122 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../002-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ... 2026-03-08T22:39:17.123 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-08T22:39:17.146 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5core5a:amd64. 2026-03-08T22:39:17.152 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../003-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:39:17.156 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:39:17.199 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5dbus5:amd64. 
2026-03-08T22:39:17.204 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../004-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:39:17.205 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:39:17.224 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5network5:amd64. 2026-03-08T22:39:17.229 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../005-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:39:17.230 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:39:17.258 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libthrift-0.16.0:amd64. 2026-03-08T22:39:17.263 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../006-libthrift-0.16.0_0.16.0-2_amd64.deb ... 2026-03-08T22:39:17.264 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-08T22:39:17.289 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../007-librbd1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:17.291 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librbd1 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 2026-03-08T22:39:17.371 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../008-librados2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:17.373 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librados2 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 2026-03-08T22:39:17.442 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libnbd0. 2026-03-08T22:39:17.448 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../009-libnbd0_1.10.5-1_amd64.deb ... 2026-03-08T22:39:17.449 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libnbd0 (1.10.5-1) ... 
2026-03-08T22:39:17.466 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs2. 2026-03-08T22:39:17.472 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../010-libcephfs2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:17.472 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:17.503 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rados. 2026-03-08T22:39:17.509 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../011-python3-rados_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:17.509 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:17.531 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-ceph-argparse. 2026-03-08T22:39:17.537 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../012-python3-ceph-argparse_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:39:17.537 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:17.552 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cephfs. 2026-03-08T22:39:17.558 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../013-python3-cephfs_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:17.558 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:17.575 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-ceph-common. 2026-03-08T22:39:17.579 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../014-python3-ceph-common_19.2.3-678-ge911bdeb-1jammy_all.deb ... 
2026-03-08T22:39:17.579 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:17.597 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-wcwidth. 2026-03-08T22:39:17.602 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../015-python3-wcwidth_0.2.5+dfsg1-1_all.deb ... 2026-03-08T22:39:17.603 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-08T22:39:17.622 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-prettytable. 2026-03-08T22:39:17.629 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../016-python3-prettytable_2.5.0-2_all.deb ... 2026-03-08T22:39:17.629 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-prettytable (2.5.0-2) ... 2026-03-08T22:39:17.646 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rbd. 2026-03-08T22:39:17.652 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../017-python3-rbd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:17.652 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:17.673 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package librdkafka1:amd64. 2026-03-08T22:39:17.678 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../018-librdkafka1_1.8.0-1build1_amd64.deb ... 2026-03-08T22:39:17.679 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-08T22:39:17.700 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libreadline-dev:amd64. 2026-03-08T22:39:17.705 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../019-libreadline-dev_8.1.2-1_amd64.deb ... 
2026-03-08T22:39:17.706 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libreadline-dev:amd64 (8.1.2-1) ... 2026-03-08T22:39:17.724 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liblua5.3-dev:amd64. 2026-03-08T22:39:17.730 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../020-liblua5.3-dev_5.3.6-1build1_amd64.deb ... 2026-03-08T22:39:17.730 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-08T22:39:17.750 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua5.1. 2026-03-08T22:39:17.755 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../021-lua5.1_5.1.5-8.1build4_amd64.deb ... 2026-03-08T22:39:17.756 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua5.1 (5.1.5-8.1build4) ... 2026-03-08T22:39:17.774 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua-any. 2026-03-08T22:39:17.779 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../022-lua-any_27ubuntu1_all.deb ... 2026-03-08T22:39:17.780 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua-any (27ubuntu1) ... 2026-03-08T22:39:17.793 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package zip. 2026-03-08T22:39:17.798 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../023-zip_3.0-12build2_amd64.deb ... 2026-03-08T22:39:17.799 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking zip (3.0-12build2) ... 2026-03-08T22:39:17.817 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package unzip. 2026-03-08T22:39:17.823 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../024-unzip_6.0-26ubuntu3.2_amd64.deb ... 2026-03-08T22:39:17.824 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking unzip (6.0-26ubuntu3.2) ... 2026-03-08T22:39:17.844 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package luarocks. 
2026-03-08T22:39:17.850 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../025-luarocks_3.8.0+dfsg1-1_all.deb ... 2026-03-08T22:39:17.851 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking luarocks (3.8.0+dfsg1-1) ... 2026-03-08T22:39:17.901 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package librgw2. 2026-03-08T22:39:17.906 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../026-librgw2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:17.907 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:18.056 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rgw. 2026-03-08T22:39:18.062 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../027-python3-rgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:18.063 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:18.082 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liboath0:amd64. 2026-03-08T22:39:18.088 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../028-liboath0_2.6.7-3ubuntu0.1_amd64.deb ... 2026-03-08T22:39:18.089 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-08T22:39:18.108 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libradosstriper1. 2026-03-08T22:39:18.115 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../029-libradosstriper1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:18.116 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:18.144 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-common. 
2026-03-08T22:39:18.149 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../030-ceph-common_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:18.150 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:18.539 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-base. 2026-03-08T22:39:18.544 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../031-ceph-base_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:18.549 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:18.656 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.functools. 2026-03-08T22:39:18.662 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../032-python3-jaraco.functools_3.4.0-2_all.deb ... 2026-03-08T22:39:18.664 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ... 2026-03-08T22:39:18.678 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cheroot. 2026-03-08T22:39:18.683 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../033-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ... 2026-03-08T22:39:18.684 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 2026-03-08T22:39:18.703 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.classes. 2026-03-08T22:39:18.708 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../034-python3-jaraco.classes_3.2.1-3_all.deb ... 2026-03-08T22:39:18.708 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ... 2026-03-08T22:39:18.723 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.text. 
2026-03-08T22:39:18.728 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../035-python3-jaraco.text_3.6.0-2_all.deb ... 2026-03-08T22:39:18.729 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.text (3.6.0-2) ... 2026-03-08T22:39:18.743 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.collections. 2026-03-08T22:39:18.748 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../036-python3-jaraco.collections_3.4.0-2_all.deb ... 2026-03-08T22:39:18.749 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ... 2026-03-08T22:39:18.765 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-tempora. 2026-03-08T22:39:18.771 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../037-python3-tempora_4.1.2-1_all.deb ... 2026-03-08T22:39:18.771 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-tempora (4.1.2-1) ... 2026-03-08T22:39:18.789 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-portend. 2026-03-08T22:39:18.794 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../038-python3-portend_3.0.0-1_all.deb ... 2026-03-08T22:39:18.795 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-portend (3.0.0-1) ... 2026-03-08T22:39:18.811 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-zc.lockfile. 2026-03-08T22:39:18.815 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../039-python3-zc.lockfile_2.0-1_all.deb ... 2026-03-08T22:39:18.816 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-zc.lockfile (2.0-1) ... 2026-03-08T22:39:18.833 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cherrypy3. 2026-03-08T22:39:18.838 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../040-python3-cherrypy3_18.6.1-4_all.deb ... 
2026-03-08T22:39:18.839 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ... 2026-03-08T22:39:18.867 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-natsort. 2026-03-08T22:39:18.873 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../041-python3-natsort_8.0.2-1_all.deb ... 2026-03-08T22:39:18.873 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-natsort (8.0.2-1) ... 2026-03-08T22:39:18.889 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-logutils. 2026-03-08T22:39:18.894 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../042-python3-logutils_0.3.3-8_all.deb ... 2026-03-08T22:39:18.895 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-logutils (0.3.3-8) ... 2026-03-08T22:39:18.908 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-mako. 2026-03-08T22:39:18.913 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../043-python3-mako_1.1.3+ds1-2ubuntu0.1_all.deb ... 2026-03-08T22:39:18.914 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-08T22:39:18.932 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-simplegeneric. 2026-03-08T22:39:18.937 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../044-python3-simplegeneric_0.8.1-3_all.deb ... 2026-03-08T22:39:18.938 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-simplegeneric (0.8.1-3) ... 2026-03-08T22:39:18.952 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-singledispatch. 2026-03-08T22:39:18.957 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../045-python3-singledispatch_3.4.0.3-3_all.deb ... 2026-03-08T22:39:18.962 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-singledispatch (3.4.0.3-3) ... 
2026-03-08T22:39:18.979 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-webob. 2026-03-08T22:39:18.984 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../046-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ... 2026-03-08T22:39:18.984 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-08T22:39:19.004 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-waitress. 2026-03-08T22:39:19.009 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../047-python3-waitress_1.4.4-1.1ubuntu1.1_all.deb ... 2026-03-08T22:39:19.011 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-08T22:39:19.028 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-tempita. 2026-03-08T22:39:19.033 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../048-python3-tempita_0.5.2-6ubuntu1_all.deb ... 2026-03-08T22:39:19.033 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-tempita (0.5.2-6ubuntu1) ... 2026-03-08T22:39:19.049 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-paste. 2026-03-08T22:39:19.054 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../049-python3-paste_3.5.0+dfsg1-1_all.deb ... 2026-03-08T22:39:19.055 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-paste (3.5.0+dfsg1-1) ... 2026-03-08T22:39:19.087 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python-pastedeploy-tpl. 2026-03-08T22:39:19.092 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../050-python-pastedeploy-tpl_2.1.1-1_all.deb ... 2026-03-08T22:39:19.093 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python-pastedeploy-tpl (2.1.1-1) ... 
2026-03-08T22:39:19.108 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pastedeploy. 2026-03-08T22:39:19.113 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../051-python3-pastedeploy_2.1.1-1_all.deb ... 2026-03-08T22:39:19.114 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pastedeploy (2.1.1-1) ... 2026-03-08T22:39:19.130 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-webtest. 2026-03-08T22:39:19.135 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../052-python3-webtest_2.0.35-1_all.deb ... 2026-03-08T22:39:19.136 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-webtest (2.0.35-1) ... 2026-03-08T22:39:19.161 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pecan. 2026-03-08T22:39:19.166 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../053-python3-pecan_1.3.3-4ubuntu2_all.deb ... 2026-03-08T22:39:19.167 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pecan (1.3.3-4ubuntu2) ... 2026-03-08T22:39:19.198 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-werkzeug. 2026-03-08T22:39:19.203 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../054-python3-werkzeug_2.0.2+dfsg1-1ubuntu0.22.04.3_all.deb ... 2026-03-08T22:39:19.204 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 2026-03-08T22:39:19.227 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-modules-core. 2026-03-08T22:39:19.232 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../055-ceph-mgr-modules-core_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:39:19.233 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:39:19.270 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libsqlite3-mod-ceph. 2026-03-08T22:39:19.275 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../056-libsqlite3-mod-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.276 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.293 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr. 2026-03-08T22:39:19.300 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../057-ceph-mgr_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.300 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.331 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mon. 2026-03-08T22:39:19.338 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../058-ceph-mon_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.339 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.442 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libfuse2:amd64. 2026-03-08T22:39:19.448 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../059-libfuse2_2.9.9-5ubuntu3_amd64.deb ... 2026-03-08T22:39:19.449 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ... 2026-03-08T22:39:19.467 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-osd. 2026-03-08T22:39:19.472 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../060-ceph-osd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.472 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:39:19.781 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph. 2026-03-08T22:39:19.787 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../061-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.787 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.804 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-fuse. 2026-03-08T22:39:19.808 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../062-ceph-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.809 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.842 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mds. 2026-03-08T22:39:19.847 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../063-ceph-mds_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.848 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.898 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package cephadm. 2026-03-08T22:39:19.904 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../064-cephadm_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:19.905 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.929 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-asyncssh. 2026-03-08T22:39:19.935 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../065-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ... 2026-03-08T22:39:19.936 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ... 
2026-03-08T22:39:19.965 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-cephadm. 2026-03-08T22:39:19.972 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../066-ceph-mgr-cephadm_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:39:19.973 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:19.998 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-repoze.lru. 2026-03-08T22:39:20.004 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../067-python3-repoze.lru_0.7-2_all.deb ... 2026-03-08T22:39:20.004 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-repoze.lru (0.7-2) ... 2026-03-08T22:39:20.023 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-routes. 2026-03-08T22:39:20.028 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../068-python3-routes_2.5.1-1ubuntu1_all.deb ... 2026-03-08T22:39:20.028 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ... 2026-03-08T22:39:20.052 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-dashboard. 2026-03-08T22:39:20.056 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../069-ceph-mgr-dashboard_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:39:20.057 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:20.425 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-sklearn-lib:amd64. 2026-03-08T22:39:20.431 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../070-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ... 2026-03-08T22:39:20.432 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 
2026-03-08T22:39:20.496 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-joblib. 2026-03-08T22:39:20.503 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../071-python3-joblib_0.17.0-4ubuntu1_all.deb ... 2026-03-08T22:39:20.504 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ... 2026-03-08T22:39:20.541 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-threadpoolctl. 2026-03-08T22:39:20.546 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../072-python3-threadpoolctl_3.1.0-1_all.deb ... 2026-03-08T22:39:20.547 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ... 2026-03-08T22:39:20.565 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-sklearn. 2026-03-08T22:39:20.572 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../073-python3-sklearn_0.23.2-5ubuntu6_all.deb ... 2026-03-08T22:39:20.573 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-08T22:39:20.701 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local. 2026-03-08T22:39:20.707 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../074-ceph-mgr-diskprediction-local_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:39:20.707 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:21.131 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cachetools. 2026-03-08T22:39:21.137 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../075-python3-cachetools_5.0.0-1_all.deb ... 2026-03-08T22:39:21.138 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cachetools (5.0.0-1) ... 
2026-03-08T22:39:21.156 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rsa. 2026-03-08T22:39:21.163 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../076-python3-rsa_4.8-1_all.deb ... 2026-03-08T22:39:21.278 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rsa (4.8-1) ... 2026-03-08T22:39:21.571 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-google-auth. 2026-03-08T22:39:21.576 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../077-python3-google-auth_1.5.1-3_all.deb ... 2026-03-08T22:39:21.719 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-google-auth (1.5.1-3) ... 2026-03-08T22:39:21.811 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-requests-oauthlib. 2026-03-08T22:39:21.817 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../078-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ... 2026-03-08T22:39:21.817 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-08T22:39:21.838 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-websocket. 2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../079-python3-websocket_1.2.3-1_all.deb ... 2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-websocket (1.2.3-1) ... 2026-03-08T22:39:21.868 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-kubernetes. 2026-03-08T22:39:21.874 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../080-python3-kubernetes_12.0.1-1ubuntu1_all.deb ... 2026-03-08T22:39:21.891 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-08T22:39:22.061 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-k8sevents. 
2026-03-08T22:39:22.067 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../081-ceph-mgr-k8sevents_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:39:22.068 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:22.081 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libonig5:amd64. 2026-03-08T22:39:22.086 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../082-libonig5_6.9.7.1-2build1_amd64.deb ... 2026-03-08T22:39:22.087 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-08T22:39:22.103 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libjq1:amd64. 2026-03-08T22:39:22.107 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../083-libjq1_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-08T22:39:22.108 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-08T22:39:22.122 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package jq. 2026-03-08T22:39:22.126 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../084-jq_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-08T22:39:22.127 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ... 2026-03-08T22:39:22.139 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package socat. 2026-03-08T22:39:22.143 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../085-socat_1.7.4.1-3ubuntu4_amd64.deb ... 2026-03-08T22:39:22.144 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ... 2026-03-08T22:39:22.174 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package xmlstarlet. 2026-03-08T22:39:22.178 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../086-xmlstarlet_1.6.1-2.1_amd64.deb ... 
2026-03-08T22:39:22.178 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking xmlstarlet (1.6.1-2.1) ... 2026-03-08T22:39:22.225 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-test. 2026-03-08T22:39:22.229 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../087-ceph-test_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:22.230 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:23.101 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-volume. 2026-03-08T22:39:23.107 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../088-ceph-volume_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:39:23.108 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:23.137 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs-dev. 2026-03-08T22:39:23.143 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../089-libcephfs-dev_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:23.144 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:23.160 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua-socket:amd64. 2026-03-08T22:39:23.166 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../090-lua-socket_3.0~rc1+git+ac3201d-6_amd64.deb ... 2026-03-08T22:39:23.168 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-08T22:39:23.193 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua-sec:amd64. 2026-03-08T22:39:23.200 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../091-lua-sec_1.0.2-1_amd64.deb ... 
2026-03-08T22:39:23.201 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua-sec:amd64 (1.0.2-1) ... 2026-03-08T22:39:23.221 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package nvme-cli. 2026-03-08T22:39:23.227 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../092-nvme-cli_1.16-3ubuntu0.3_amd64.deb ... 2026-03-08T22:39:23.228 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ... 2026-03-08T22:39:23.268 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package pkg-config. 2026-03-08T22:39:23.274 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../093-pkg-config_0.29.2-1ubuntu3_amd64.deb ... 2026-03-08T22:39:23.276 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking pkg-config (0.29.2-1ubuntu3) ... 2026-03-08T22:39:23.291 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python-asyncssh-doc. 2026-03-08T22:39:23.296 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../094-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ... 2026-03-08T22:39:23.297 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-08T22:39:23.343 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-iniconfig. 2026-03-08T22:39:23.350 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../095-python3-iniconfig_1.1.1-2_all.deb ... 2026-03-08T22:39:23.351 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-iniconfig (1.1.1-2) ... 2026-03-08T22:39:23.369 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pastescript. 2026-03-08T22:39:23.376 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../096-python3-pastescript_2.0.2-4_all.deb ... 2026-03-08T22:39:23.376 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pastescript (2.0.2-4) ... 
2026-03-08T22:39:23.394 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pluggy. 2026-03-08T22:39:23.399 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../097-python3-pluggy_0.13.0-7.1_all.deb ... 2026-03-08T22:39:23.400 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pluggy (0.13.0-7.1) ... 2026-03-08T22:39:23.415 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-psutil. 2026-03-08T22:39:23.419 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../098-python3-psutil_5.9.0-1build1_amd64.deb ... 2026-03-08T22:39:23.419 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-psutil (5.9.0-1build1) ... 2026-03-08T22:39:23.445 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-py. 2026-03-08T22:39:23.449 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../099-python3-py_1.10.0-1_all.deb ... 2026-03-08T22:39:23.450 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-py (1.10.0-1) ... 2026-03-08T22:39:23.470 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pygments. 2026-03-08T22:39:23.474 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../100-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ... 2026-03-08T22:39:23.475 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ... 2026-03-08T22:39:23.538 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pyinotify. 2026-03-08T22:39:23.544 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../101-python3-pyinotify_0.9.6-1.3_all.deb ... 2026-03-08T22:39:23.545 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pyinotify (0.9.6-1.3) ... 2026-03-08T22:39:23.562 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-toml. 
2026-03-08T22:39:23.569 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../102-python3-toml_0.10.2-1_all.deb ... 2026-03-08T22:39:23.570 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-toml (0.10.2-1) ... 2026-03-08T22:39:23.587 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pytest. 2026-03-08T22:39:23.593 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../103-python3-pytest_6.2.5-1ubuntu2_all.deb ... 2026-03-08T22:39:23.594 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ... 2026-03-08T22:39:23.623 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-simplejson. 2026-03-08T22:39:23.630 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../104-python3-simplejson_3.17.6-1build1_amd64.deb ... 2026-03-08T22:39:23.631 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-simplejson (3.17.6-1build1) ... 2026-03-08T22:39:23.652 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package qttranslations5-l10n. 2026-03-08T22:39:23.658 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../105-qttranslations5-l10n_5.15.3-1_all.deb ... 2026-03-08T22:39:23.659 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ... 2026-03-08T22:39:23.803 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package radosgw. 2026-03-08T22:39:23.809 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../106-radosgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:39:23.809 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:24.032 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package rbd-fuse. 2026-03-08T22:39:24.033 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../107-rbd-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 
2026-03-08T22:39:24.034 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:24.051 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package smartmontools. 2026-03-08T22:39:24.057 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../108-smartmontools_7.2-1ubuntu0.1_amd64.deb ... 2026-03-08T22:39:24.065 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ... 2026-03-08T22:39:24.113 INFO:teuthology.orchestra.run.vm01.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ... 2026-03-08T22:39:24.340 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service. 2026-03-08T22:39:24.340 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service. 2026-03-08T22:39:24.706 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-iniconfig (1.1.1-2) ... 2026-03-08T22:39:24.771 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-08T22:39:24.773 INFO:teuthology.orchestra.run.vm01.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ... 2026-03-08T22:39:24.832 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service. 2026-03-08T22:39:25.077 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service. 2026-03-08T22:39:25.434 INFO:teuthology.orchestra.run.vm01.stdout:nvmf-connect.target is a disabled or a static unit, not starting it. 2026-03-08T22:39:25.442 INFO:teuthology.orchestra.run.vm01.stdout:Could not execute systemctl: at /usr/bin/deb-systemd-invoke line 142. 
2026-03-08T22:39:25.445 INFO:teuthology.orchestra.run.vm01.stdout:Setting up cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:25.488 INFO:teuthology.orchestra.run.vm01.stdout:Adding system user cephadm....done 2026-03-08T22:39:25.498 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-08T22:39:25.572 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.classes (3.2.1-3) ... 2026-03-08T22:39:25.635 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-08T22:39:25.637 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.functools (3.4.0-2) ... 2026-03-08T22:39:25.700 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-repoze.lru (0.7-2) ... 2026-03-08T22:39:25.767 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-08T22:39:25.770 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-py (1.10.0-1) ... 2026-03-08T22:39:25.858 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ... 2026-03-08T22:39:25.978 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cachetools (5.0.0-1) ... 2026-03-08T22:39:26.050 INFO:teuthology.orchestra.run.vm01.stdout:Setting up unzip (6.0-26ubuntu3.2) ... 2026-03-08T22:39:26.058 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pyinotify (0.9.6-1.3) ... 2026-03-08T22:39:26.129 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-threadpoolctl (3.1.0-1) ... 2026-03-08T22:39:26.200 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:26.273 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 2026-03-08T22:39:26.275 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libnbd0 (1.10.5-1) ... 
2026-03-08T22:39:26.278 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-08T22:39:26.280 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libreadline-dev:amd64 (8.1.2-1) ... 2026-03-08T22:39:26.282 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ... 2026-03-08T22:39:26.284 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua5.1 (5.1.5-8.1build4) ... 2026-03-08T22:39:26.288 INFO:teuthology.orchestra.run.vm01.stdout:update-alternatives: using /usr/bin/lua5.1 to provide /usr/bin/lua (lua-interpreter) in auto mode 2026-03-08T22:39:26.290 INFO:teuthology.orchestra.run.vm01.stdout:update-alternatives: using /usr/bin/luac5.1 to provide /usr/bin/luac (lua-compiler) in auto mode 2026-03-08T22:39:26.292 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-08T22:39:26.295 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-psutil (5.9.0-1build1) ... 2026-03-08T22:39:26.414 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-natsort (8.0.2-1) ... 2026-03-08T22:39:26.484 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ... 2026-03-08T22:39:26.552 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-simplejson (3.17.6-1build1) ... 2026-03-08T22:39:26.631 INFO:teuthology.orchestra.run.vm01.stdout:Setting up zip (3.0-12build2) ... 2026-03-08T22:39:26.634 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ... 2026-03-08T22:39:26.918 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-tempita (0.5.2-6ubuntu1) ... 2026-03-08T22:39:26.994 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python-pastedeploy-tpl (2.1.1-1) ... 2026-03-08T22:39:26.996 INFO:teuthology.orchestra.run.vm01.stdout:Setting up qttranslations5-l10n (5.15.3-1) ... 
2026-03-08T22:39:26.999 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-08T22:39:27.093 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-08T22:39:27.237 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-paste (3.5.0+dfsg1-1) ... 2026-03-08T22:39:27.366 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 2026-03-08T22:39:27.452 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 2026-03-08T22:39:27.564 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.text (3.6.0-2) ... 2026-03-08T22:39:27.628 INFO:teuthology.orchestra.run.vm01.stdout:Setting up socat (1.7.4.1-3ubuntu4) ... 2026-03-08T22:39:27.631 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:27.724 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-08T22:39:28.286 INFO:teuthology.orchestra.run.vm01.stdout:Setting up pkg-config (0.29.2-1ubuntu3) ... 2026-03-08T22:39:28.306 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:39:28.310 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-toml (0.10.2-1) ... 2026-03-08T22:39:28.382 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-08T22:39:28.384 INFO:teuthology.orchestra.run.vm01.stdout:Setting up xmlstarlet (1.6.1-2.1) ... 2026-03-08T22:39:28.386 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pluggy (0.13.0-7.1) ... 2026-03-08T22:39:28.454 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-zc.lockfile (2.0-1) ... 2026-03-08T22:39:28.520 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 
2026-03-08T22:39:28.522 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rsa (4.8-1) ... 2026-03-08T22:39:28.596 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-singledispatch (3.4.0.3-3) ... 2026-03-08T22:39:28.665 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-logutils (0.3.3-8) ... 2026-03-08T22:39:28.738 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-tempora (4.1.2-1) ... 2026-03-08T22:39:28.805 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-simplegeneric (0.8.1-3) ... 2026-03-08T22:39:28.869 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-prettytable (2.5.0-2) ... 2026-03-08T22:39:28.938 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-08T22:39:28.941 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-websocket (1.2.3-1) ... 2026-03-08T22:39:29.019 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-08T22:39:29.021 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-08T22:39:29.086 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-08T22:39:29.167 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-08T22:39:29.260 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.collections (3.4.0-2) ... 2026-03-08T22:39:29.330 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-08T22:39:29.332 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua-sec:amd64 (1.0.2-1) ... 2026-03-08T22:39:29.334 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-08T22:39:29.336 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ... 
2026-03-08T22:39:29.474 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pastedeploy (2.1.1-1) ... 2026-03-08T22:39:29.544 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua-any (27ubuntu1) ... 2026-03-08T22:39:29.546 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-portend (3.0.0-1) ... 2026-03-08T22:39:29.616 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:39:29.618 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-google-auth (1.5.1-3) ... 2026-03-08T22:39:29.694 INFO:teuthology.orchestra.run.vm01.stdout:Setting up jq (1.6-2.1ubuntu3.1) ... 2026-03-08T22:39:29.696 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-webtest (2.0.35-1) ... 2026-03-08T22:39:29.766 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cherrypy3 (18.6.1-4) ... 2026-03-08T22:39:29.896 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pastescript (2.0.2-4) ... 2026-03-08T22:39:29.984 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pecan (1.3.3-4ubuntu2) ... 2026-03-08T22:39:30.107 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-08T22:39:30.109 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librados2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:30.111 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:30.113 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-08T22:39:30.674 INFO:teuthology.orchestra.run.vm01.stdout:Setting up luarocks (3.8.0+dfsg1-1) ... 2026-03-08T22:39:30.680 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:30.683 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:39:30.685 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librbd1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:30.687 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:30.689 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:30.748 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target. 2026-03-08T22:39:30.748 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target. 2026-03-08T22:39:31.086 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:31.088 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:31.090 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:31.092 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:31.095 INFO:teuthology.orchestra.run.vm01.stdout:Setting up rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:31.097 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:31.098 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:31.101 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:39:31.134 INFO:teuthology.orchestra.run.vm01.stdout:Adding group ceph....done 2026-03-08T22:39:31.171 INFO:teuthology.orchestra.run.vm01.stdout:Adding system user ceph....done 2026-03-08T22:39:31.180 INFO:teuthology.orchestra.run.vm01.stdout:Setting system user ceph properties....done 2026-03-08T22:39:31.183 INFO:teuthology.orchestra.run.vm01.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory 2026-03-08T22:39:31.249 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target. 2026-03-08T22:39:31.484 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service. 2026-03-08T22:39:31.897 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:32.141 INFO:teuthology.orchestra.run.vm01.stdout:Setting up radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:32.583 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:39:32.583 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:39:32.942 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:33.025 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service. 2026-03-08T22:39:33.423 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:39:33.505 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-08T22:39:33.505 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-08T22:39:33.838 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:33.900 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-08T22:39:33.900 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-08T22:39:34.292 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:34.368 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 2026-03-08T22:39:34.368 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 2026-03-08T22:39:34.792 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:34.793 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:34.806 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:34.864 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 
2026-03-08T22:39:34.864 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 2026-03-08T22:39:35.226 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:35.240 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:35.242 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:35.256 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:39:35.568 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ... 2026-03-08T22:39:35.577 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T22:39:35.591 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T22:39:35.673 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for install-info (6.8-4build1) ... 2026-03-08T22:39:36.045 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:36.045 INFO:teuthology.orchestra.run.vm01.stdout:Running kernel seems to be up-to-date. 
2026-03-08T22:39:36.045 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:36.045 INFO:teuthology.orchestra.run.vm01.stdout:Services to be restarted: 2026-03-08T22:39:36.047 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart packagekit.service 2026-03-08T22:39:36.050 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:36.050 INFO:teuthology.orchestra.run.vm01.stdout:Service restarts being deferred: 2026-03-08T22:39:36.050 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart unattended-upgrades.service 2026-03-08T22:39:36.050 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:36.051 INFO:teuthology.orchestra.run.vm01.stdout:No containers need to be restarted. 2026-03-08T22:39:36.051 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:36.051 INFO:teuthology.orchestra.run.vm01.stdout:No user sessions are running outdated binaries. 2026-03-08T22:39:36.051 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:36.051 INFO:teuthology.orchestra.run.vm01.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-08T22:39:36.898 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:39:36.900 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-xmltodict python3-jmespath 2026-03-08T22:39:36.975 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-08T22:39:37.167 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-08T22:39:37.168 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 
2026-03-08T22:39:37.298 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:39:37.298 INFO:teuthology.orchestra.run.vm01.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T22:39:37.299 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-08T22:39:37.299 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:39:37.314 INFO:teuthology.orchestra.run.vm01.stdout:The following NEW packages will be installed: 2026-03-08T22:39:37.314 INFO:teuthology.orchestra.run.vm01.stdout: python3-jmespath python3-xmltodict 2026-03-08T22:39:37.398 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 2 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:39:37.399 INFO:teuthology.orchestra.run.vm01.stdout:Need to get 34.3 kB of archives. 2026-03-08T22:39:37.399 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 146 kB of additional disk space will be used. 2026-03-08T22:39:37.399 INFO:teuthology.orchestra.run.vm01.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB] 2026-03-08T22:39:37.415 INFO:teuthology.orchestra.run.vm01.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB] 2026-03-08T22:39:37.606 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 34.3 kB in 0s (349 kB/s) 2026-03-08T22:39:37.620 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jmespath. 2026-03-08T22:39:37.651 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 
55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118577 files and directories currently installed.) 2026-03-08T22:39:37.653 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ... 2026-03-08T22:39:37.654 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jmespath (0.10.0-1) ... 2026-03-08T22:39:37.674 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-xmltodict. 2026-03-08T22:39:37.680 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ... 2026-03-08T22:39:37.712 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-xmltodict (0.12.0-2) ... 2026-03-08T22:39:37.739 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-xmltodict (0.12.0-2) ... 2026-03-08T22:39:37.807 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jmespath (0.10.0-1) ... 2026-03-08T22:39:38.214 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:38.215 INFO:teuthology.orchestra.run.vm01.stdout:Running kernel seems to be up-to-date. 
2026-03-08T22:39:38.215 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:38.215 INFO:teuthology.orchestra.run.vm01.stdout:Services to be restarted: 2026-03-08T22:39:38.217 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart packagekit.service 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout:Service restarts being deferred: 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart unattended-upgrades.service 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout:No containers need to be restarted. 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout:No user sessions are running outdated binaries. 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-08T22:39:38.220 INFO:teuthology.orchestra.run.vm01.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-08T22:39:39.166 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:39:39.170 DEBUG:teuthology.parallel:result is None 2026-03-08T22:39:39.170 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-08T22:39:39.771 DEBUG:teuthology.orchestra.run.vm01:> dpkg-query -W -f '${Version}' ceph 2026-03-08T22:39:39.780 INFO:teuthology.orchestra.run.vm01.stdout:19.2.3-678-ge911bdeb-1jammy 2026-03-08T22:39:39.781 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678-ge911bdeb-1jammy 2026-03-08T22:39:39.781 INFO:teuthology.task.install:The correct ceph version 19.2.3-678-ge911bdeb-1jammy is installed. 
2026-03-08T22:39:39.782 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-08T22:39:39.782 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-08T22:39:39.782 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-08T22:39:39.834 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 2026-03-08T22:39:39.834 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-08T22:39:39.834 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/daemon-helper 2026-03-08T22:39:39.885 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-08T22:39:39.938 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-08T22:39:39.938 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-08T22:39:39.938 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-08T22:39:39.988 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-08T22:39:40.038 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 2026-03-08T22:39:40.039 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-08T22:39:40.039 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/stdin-killer 2026-03-08T22:39:40.088 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-08T22:39:40.139 INFO:teuthology.run_tasks:Running task workunit... 2026-03-08T22:39:40.144 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:39:40.145 INFO:tasks.workunit:Making a separate scratch dir for every client... 
2026-03-08T22:39:40.145 INFO:tasks.workunit:timeout=3h 2026-03-08T22:39:40.145 INFO:tasks.workunit:cleanup=True 2026-03-08T22:39:40.145 DEBUG:teuthology.orchestra.run.vm01:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:39:40.184 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-08T22:39:40.184 INFO:teuthology.orchestra.run.vm01.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-08T22:39:40.185 DEBUG:teuthology.orchestra.run.vm01:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:39:40.233 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-08T22:39:40.233 DEBUG:teuthology.orchestra.run.vm01:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-08T22:39:40.281 DEBUG:teuthology.orchestra.run.vm01:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:39:40.329 INFO:tasks.workunit.client.0.vm01.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:state without impacting any branches by switching back to a branch. 
2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: git switch -c 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:Or undo this operation with: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: git switch - 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-08T22:40:38.280 INFO:tasks.workunit.client.0.vm01.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-08T22:40:38.288 DEBUG:teuthology.orchestra.run.vm01:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/standalone && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-08T22:40:38.333 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-08T22:40:38.333 DEBUG:teuthology.orchestra.run.vm01:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-08T22:40:38.380 INFO:tasks.workunit:Running workunits matching mgr on client.0... 2026-03-08T22:40:38.380 INFO:tasks.workunit:Running workunit mgr/balancer.sh... 
2026-03-08T22:40:38.380 DEBUG:teuthology.orchestra.run.vm01:workunit test mgr/balancer.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh
2026-03-08T22:40:38.429 INFO:tasks.workunit.client.0.vm01.stderr:stty: 'standard input': Inappropriate ioctl for device
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: '
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/balancer
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:20: run: local dir=td/balancer
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:21: run: shift
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:23: run: export CEPH_MON=127.0.0.1:7102
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:23: run: CEPH_MON=127.0.0.1:7102
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:24: run: export CEPH_ARGS
2026-03-08T22:40:38.433 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:25: run: uuidgen
2026-03-08T22:40:38.434 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:25: run: CEPH_ARGS+='--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none '
2026-03-08T22:40:38.434 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:26: run: CEPH_ARGS+='--mon-host=127.0.0.1:7102 '
2026-03-08T22:40:38.434 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:28: run: set
2026-03-08T22:40:38.434 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:28: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p'
2026-03-08T22:40:38.435 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:28: run: local 'funcs=TEST_balancer
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:TEST_balancer2'
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:29: run: for func in $funcs
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:30: run: TEST_balancer td/balancer
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:38: TEST_balancer: local dir=td/balancer
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:40: TEST_balancer: setup td/balancer
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/balancer
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/balancer
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/balancer
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/balancer KILL
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:40:38.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:40:38.438 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:40:38.438 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:40:38.439 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:40:38.439 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:40:38.440 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:40:38.440 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:40:38.440 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:40:38.441 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:40:38.441 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:40:38.441 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:40:38.441 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:40:38.442 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:40:38.443 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:40:38.443 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/balancer
2026-03-08T22:40:38.444 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:40:38.444 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:40:38.444 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:40:38.444 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19264
2026-03-08T22:40:38.445 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:40:38.445 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:40:38.445 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/balancer
2026-03-08T22:40:38.446 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:40:38.446 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:40:38.446 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:40:38.447 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.19264
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']'
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/balancer 1' TERM HUP INT
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:41: TEST_balancer: run_mon td/balancer a
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/balancer
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/balancer/a
2026-03-08T22:40:38.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/balancer/a --run-dir=td/balancer
2026-03-08T22:40:38.549 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:40:38.549 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:40:38.549 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:40:38.550 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:40:38.550 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:40:38.550 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:40:38.550 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:40:38.550 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/balancer/a '--log-file=td/balancer/$name.log' '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --mon-cluster-log-file=td/balancer/log --run-dir=td/balancer '--pid-file=td/balancer/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:40:38.615 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:40:38.617 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19264/ceph-mon.a.asok
2026-03-08T22:40:38.617 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:40:38.617 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19264/ceph-mon.a.asok config get fsid
2026-03-08T22:40:38.682 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:40:38.682 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:40:38.682 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:40:38.683 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19264/ceph-mon.a.asok
2026-03-08T22:40:38.684 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:40:38.684 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19264/ceph-mon.a.asok config get mon_host
2026-03-08T22:40:38.749 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:42: TEST_balancer: run_mgr td/balancer x
2026-03-08T22:40:38.749 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/balancer
2026-03-08T22:40:38.749 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:40:38.749 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:40:38.749 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:40:38.749 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/balancer/x
2026-03-08T22:40:38.749 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:40:38.872 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:40:38.873 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/balancer/x '--log-file=td/balancer/$name.log' '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --run-dir=td/balancer '--pid-file=td/balancer/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:43: TEST_balancer: run_osd td/balancer 0
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/0
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 '
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/0'
2026-03-08T22:40:38.890 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/0/journal'
2026-03-08T22:40:38.891 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:40:38.891 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:40:38.891 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer'
2026-03-08T22:40:38.891 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:40:38.891 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:40:38.891 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:40:38.892 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/0
2026-03-08T22:40:38.893 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:40:38.894 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=6ec7de81-1f82-40db-ada5-848ade44032f
2026-03-08T22:40:38.894 INFO:tasks.workunit.client.0.vm01.stdout:add osd0 6ec7de81-1f82-40db-ada5-848ade44032f
2026-03-08T22:40:38.894 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 6ec7de81-1f82-40db-ada5-848ade44032f'
2026-03-08T22:40:38.894 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:40:38.907 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDm+q1pvAjfNRAAGtbTSHLHNVz+UQbaAPGdkg==
2026-03-08T22:40:38.907 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDm+q1pvAjfNRAAGtbTSHLHNVz+UQbaAPGdkg=="}'
2026-03-08T22:40:38.907 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 6ec7de81-1f82-40db-ada5-848ade44032f -i td/balancer/0/new.json
2026-03-08T22:40:39.060 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-08T22:40:39.071 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/0/new.json
2026-03-08T22:40:39.072 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/0 --osd-journal=td/balancer/0/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDm+q1pvAjfNRAAGtbTSHLHNVz+UQbaAPGdkg== --osd-uuid 6ec7de81-1f82-40db-ada5-848ade44032f
2026-03-08T22:40:39.087 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:39.081+0000 7f257dffa8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:39.089 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:39.085+0000 7f257dffa8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:39.091 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:39.085+0000 7f257dffa8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:39.091 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:39.085+0000 7f257dffa8c0 -1 bdev(0x56080c220c00 td/balancer/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:40:39.091 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:39.085+0000 7f257dffa8c0 -1 bluestore(td/balancer/0) _read_fsid unparsable uuid
2026-03-08T22:40:41.520 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/0/keyring
2026-03-08T22:40:41.520 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:40:41.522 INFO:tasks.workunit.client.0.vm01.stdout:adding osd0 key to auth repository
2026-03-08T22:40:41.522 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:40:41.522 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:40:41.647 INFO:tasks.workunit.client.0.vm01.stdout:start osd.0
2026-03-08T22:40:41.647 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:40:41.647 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/0 --osd-journal=td/balancer/0/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:40:41.653 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:40:41.653 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:40:41.653 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:40:41.685 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:41.673+0000 7f49b310d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:41.694 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:41.689+0000 7f49b310d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:41.709 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:41.697+0000 7f49b310d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:41.796 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:40:41.796 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:40:41.796 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:40:41.796 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:40:41.796 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-08T22:40:41.797 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:40:41.797 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:41.797 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:40:41.797 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:41.801 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:40:41.949 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:42.674 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:42.669+0000 7f49b310d8c0 -1 Falling back to public interface 2026-03-08T22:40:42.950 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-08T22:40:42.950 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:42.950 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:42.950 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:40:42.950 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:42.950 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:40:43.173 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:43.642 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:43.637+0000 7f49b310d8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:40:44.174 INFO:tasks.workunit.client.0.vm01.stdout:2 2026-03-08T22:40:44.175 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:44.175 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:44.175 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:40:44.175 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:44.175 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:40:44.473 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:45.474 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-08T22:40:45.474 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:45.474 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:45.474 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:40:45.474 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:45.474 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/569627079,v1:127.0.0.1:6803/569627079] [v2:127.0.0.1:6804/569627079,v1:127.0.0.1:6805/569627079] exists,up 6ec7de81-1f82-40db-ada5-848ade44032f 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:44: TEST_balancer: run_osd td/balancer 1 2026-03-08T22:40:45.698 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/1 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 ' 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/1' 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/1/journal' 
2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:40:45.698 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:40:45.699 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: 
ceph_args+= 2026-03-08T22:40:45.699 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/1 2026-03-08T22:40:45.700 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:40:45.701 INFO:tasks.workunit.client.0.vm01.stdout:add osd1 e7368a9a-b8f8-4f81-bab3-e350d65517d8 2026-03-08T22:40:45.701 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e7368a9a-b8f8-4f81-bab3-e350d65517d8 2026-03-08T22:40:45.701 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 e7368a9a-b8f8-4f81-bab3-e350d65517d8' 2026-03-08T22:40:45.701 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:40:45.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDt+q1ps8VeKhAArhS38iGl4zBiYutSxgJnSQ== 2026-03-08T22:40:45.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDt+q1ps8VeKhAArhS38iGl4zBiYutSxgJnSQ=="}' 2026-03-08T22:40:45.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e7368a9a-b8f8-4f81-bab3-e350d65517d8 -i td/balancer/1/new.json 2026-03-08T22:40:45.941 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-08T22:40:45.954 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/1/new.json 2026-03-08T22:40:45.955 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/1 --osd-journal=td/balancer/1/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDt+q1ps8VeKhAArhS38iGl4zBiYutSxgJnSQ== --osd-uuid e7368a9a-b8f8-4f81-bab3-e350d65517d8 2026-03-08T22:40:45.972 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:45.965+0000 7f85bac818c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:45.974 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:45.969+0000 7f85bac818c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:45.975 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:45.969+0000 7f85bac818c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:40:45.976 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:45.969+0000 7f85bac818c0 -1 bdev(0x55888092fc00 td/balancer/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:40:45.976 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:45.969+0000 7f85bac818c0 -1 bluestore(td/balancer/1) _read_fsid unparsable uuid 2026-03-08T22:40:48.383 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/1/keyring 2026-03-08T22:40:48.383 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:40:48.383 INFO:tasks.workunit.client.0.vm01.stdout:adding osd1 key to auth repository 2026-03-08T22:40:48.384 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:40:48.384 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:40:48.682 INFO:tasks.workunit.client.0.vm01.stdout:start osd.1 2026-03-08T22:40:48.683 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:40:48.683 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/1 --osd-journal=td/balancer/1/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:40:48.683 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:40:48.684 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:40:48.685 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:40:48.698 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:48.689+0000 7f3ce97da8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:48.698 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:48.693+0000 7f3ce97da8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:48.699 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:48.693+0000 7f3ce97da8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:48.914 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:40:49.124 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:49.658 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:49.653+0000 7f3ce97da8c0 -1 Falling back to public interface 2026-03-08T22:40:50.126 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-08T22:40:50.126 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:50.126 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:50.126 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:40:50.126 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:50.126 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:40:50.385 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:50.624 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:50.617+0000 7f3ce97da8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:40:51.387 INFO:tasks.workunit.client.0.vm01.stdout:2 2026-03-08T22:40:51.387 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:51.387 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:51.387 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:40:51.387 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:51.387 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:40:51.618 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:51.861 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:51.853+0000 7f3ce4f93640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T22:40:52.619 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-08T22:40:52.620 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:52.620 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:52.620 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:40:52.620 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:52.620 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:40:52.876 INFO:tasks.workunit.client.0.vm01.stdout:osd.1 up in weight 1 up_from 9 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3527849159,v1:127.0.0.1:6811/3527849159] [v2:127.0.0.1:6812/3527849159,v1:127.0.0.1:6813/3527849159] exists,up e7368a9a-b8f8-4f81-bab3-e350d65517d8 2026-03-08T22:40:52.876 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:40:52.876 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:40:52.876 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:40:52.876 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:45: TEST_balancer: run_osd td/balancer 2 2026-03-08T22:40:52.876 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/2 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 ' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/2' 2026-03-08T22:40:52.877 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/2/journal' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:52.877 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:40:52.878 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:40:52.878 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/2
2026-03-08T22:40:52.879 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:40:52.880 INFO:tasks.workunit.client.0.vm01.stdout:add osd2 7c616b80-42da-43d2-9ac5-e70289307820
2026-03-08T22:40:52.880 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7c616b80-42da-43d2-9ac5-e70289307820
2026-03-08T22:40:52.880 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 7c616b80-42da-43d2-9ac5-e70289307820'
2026-03-08T22:40:52.881 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:40:52.894 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD0+q1pP1w6NRAAkCXplx0J5GywYCLuZsgX3g==
2026-03-08T22:40:52.894 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD0+q1pP1w6NRAAkCXplx0J5GywYCLuZsgX3g=="}'
2026-03-08T22:40:52.894 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7c616b80-42da-43d2-9ac5-e70289307820 -i td/balancer/2/new.json
2026-03-08T22:40:53.120 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-08T22:40:53.132 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/2/new.json
2026-03-08T22:40:53.133 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/2 --osd-journal=td/balancer/2/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQD0+q1pP1w6NRAAkCXplx0J5GywYCLuZsgX3g== --osd-uuid 7c616b80-42da-43d2-9ac5-e70289307820
2026-03-08T22:40:53.147 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:53.145+0000 7fa5464f38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:53.149 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:53.145+0000 7fa5464f38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:53.150 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:53.145+0000 7fa5464f38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:53.150 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:53.145+0000 7fa5464f38c0 -1 bdev(0x558fc9d9fc00 td/balancer/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:40:53.150 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:53.145+0000 7fa5464f38c0 -1 bluestore(td/balancer/2) _read_fsid unparsable uuid
2026-03-08T22:40:55.966 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/2/keyring
2026-03-08T22:40:55.966 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:40:55.966 INFO:tasks.workunit.client.0.vm01.stdout:adding osd2 key to auth repository
2026-03-08T22:40:55.966 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:40:55.966 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:40:56.266 INFO:tasks.workunit.client.0.vm01.stdout:start osd.2
2026-03-08T22:40:56.267 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:40:56.267 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/2 --osd-journal=td/balancer/2/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:40:56.267 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:40:56.267 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:40:56.273 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:40:56.284 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:56.277+0000 7f8451d878c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:56.286 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:56.281+0000 7f8451d878c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:56.289 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:56.281+0000 7f8451d878c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:40:56.486 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:40:56.710 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:40:56.978 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:56.973+0000 7f8451d878c0 -1 Falling back to public interface
2026-03-08T22:40:57.711 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-08T22:40:57.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:40:57.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:40:57.712 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:40:57.712 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:40:57.712 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:40:57.939 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:40:58.203 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:40:58.197+0000 7f8451d878c0 -1 osd.2 0 log_to_monitors true
2026-03-08T22:40:58.940 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-08T22:40:58.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:40:58.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:40:58.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:40:58.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:40:58.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:40:59.178 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:00.179 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-08T22:41:00.180 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:00.180 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:00.180 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:41:00.180 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:00.180 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:41:00.434 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:01.435 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-08T22:41:01.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:01.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:01.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T22:41:01.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:01.436 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:41:01.658 INFO:tasks.workunit.client.0.vm01.stdout:osd.2 up in weight 1 up_from 13 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1581112296,v1:127.0.0.1:6819/1581112296] [v2:127.0.0.1:6820/1581112296,v1:127.0.0.1:6821/1581112296] exists,up 7c616b80-42da-43d2-9ac5-e70289307820
2026-03-08T22:41:01.658 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:41:01.658 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:41:01.658 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:41:01.658 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:46: TEST_balancer: create_pool test1 8
2026-03-08T22:41:01.658 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test1 8
2026-03-08T22:41:01.974 INFO:tasks.workunit.client.0.vm01.stderr:pool 'test1' already exists
2026-03-08T22:41:01.988 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:41:02.989 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:47: TEST_balancer: create_pool test2 8
2026-03-08T22:41:02.989 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test2 8
2026-03-08T22:41:03.279 INFO:tasks.workunit.client.0.vm01.stderr:pool 'test2' already exists
2026-03-08T22:41:03.293 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:41:04.294 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:49: TEST_balancer: wait_for_clean
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:41:04.295 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:41:04.353 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:41:04.353 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:41:04.353 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:41:04.353 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:41:04.353 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:41:04.353 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:41:04.572 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:41:04.572 INFO:tasks.workunit.client.0.vm01.stderr:1
2026-03-08T22:41:04.572 INFO:tasks.workunit.client.0.vm01.stderr:2'
2026-03-08T22:41:04.572 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:41:04.572 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:41:04.573 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:41:04.659 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486
2026-03-08T22:41:04.659 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486
2026-03-08T22:41:04.659 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486'
2026-03-08T22:41:04.659 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:41:04.660 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:41:04.739 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705669
2026-03-08T22:41:04.739 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705669
2026-03-08T22:41:04.739 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-38654705669'
2026-03-08T22:41:04.739 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:41:04.740 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:41:04.830 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574851
2026-03-08T22:41:04.830 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574851
2026-03-08T22:41:04.830 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-38654705669 2-55834574851'
2026-03-08T22:41:04.830 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:41:04.830 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486
2026-03-08T22:41:04.830 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:41:04.832 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:41:04.832 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486
2026-03-08T22:41:04.832 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:41:04.833 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.0 seq 21474836486
2026-03-08T22:41:04.833 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486
2026-03-08T22:41:04.833 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486'
2026-03-08T22:41:04.833 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:41:05.054 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836486
2026-03-08T22:41:05.054 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:41:06.055 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:41:06.055 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:41:06.284 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836486
2026-03-08T22:41:06.284 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:41:06.284 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705669
2026-03-08T22:41:06.285 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:41:06.285 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:41:06.286 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705669
2026-03-08T22:41:06.286 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:41:06.287 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705669
2026-03-08T22:41:06.287 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.1 seq 38654705669
2026-03-08T22:41:06.287 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705669'
2026-03-08T22:41:06.287 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:41:06.522 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705669 -lt 38654705669
2026-03-08T22:41:06.522 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:41:06.523 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574851
2026-03-08T22:41:06.523 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:41:06.523 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:41:06.524 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574851
2026-03-08T22:41:06.524 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:41:06.525 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574851
2026-03-08T22:41:06.526 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.2 seq 55834574851
2026-03-08T22:41:06.526 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574851'
2026-03-08T22:41:06.526 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:41:06.757 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574851 -lt 55834574851
2026-03-08T22:41:06.757 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:41:06.757 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:41:06.757 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:41:07.047 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 16 == 0
2026-03-08T22:41:07.047 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:41:07.048 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:41:07.048 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:41:07.048 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:41:07.048 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:41:07.048 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:41:07.048 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:41:07.265 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=16
2026-03-08T22:41:07.265 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:41:07.266 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:41:07.266 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:41:07.549 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 16 = 16
2026-03-08T22:41:07.549 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:41:07.549 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:41:07.549 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:51: TEST_balancer: ceph pg dump pgs
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.025946+0000 0'0 19:19 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-10T03:43:30.751734+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:2.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.346722+0000 0'0 19:13 [1,2,0] 1 [1,2,0] 1 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T00:39:49.278486+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.028842+0000 0'0 19:19 [2,0,1] 2 [2,0,1] 2 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:26:46.731082+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:2.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.347301+0000 0'0 18:10 [0,1,2] 0 [0,1,2] 0 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T02:23:05.602112+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.027100+0000 0'0 18:16 [0,1,2] 0 [0,1,2] 0 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:25:32.781000+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:2.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.343790+0000 0'0 19:13 [2,1,0] 2 [2,1,0] 2 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:16:31.311112+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.028311+0000 0'0 19:19 [1,2,0] 1 [1,2,0] 1 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-10T02:53:34.716586+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.342941+0000 0'0 19:13 [2,1,0] 2 [2,1,0] 2 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:54:00.695053+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:2.7 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.344476+0000 0'0 19:13 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:56:11.503091+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:1.4 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.025225+0000 0'0 19:19 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:42:47.923017+0000 0 0
2026-03-08T22:41:07.769 INFO:tasks.workunit.client.0.vm01.stdout:2.6 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.340722+0000 0'0 19:13 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:39:46.906912+0000 0 0
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stdout:1.5 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.028145+0000 0'0 19:19 [2,0,1] 2 [2,0,1] 2 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-10T03:45:34.367419+0000 0 0
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stdout:2.5 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.345194+0000 0'0 19:13 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T03:55:03.089507+0000 0 0
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stdout:1.6 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.025843+0000 0'0 19:19 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-10T10:22:38.442552+0000 0 0
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stdout:1.7 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:02.026724+0000 0'0 19:19 [1,2,0] 1 [1,2,0] 1 0'0 2026-03-08T22:41:01.889997+0000 0'0 2026-03-08T22:41:01.889997+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:21:27.145861+0000 0 0
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stdout:2.4 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:41:03.344238+0000 0'0 19:13 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:41:03.218563+0000 0'0 2026-03-08T22:41:03.218563+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:36:28.737330+0000 0 0
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stdout:
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-08T22:41:07.770 INFO:tasks.workunit.client.0.vm01.stderr:dumped pgs
2026-03-08T22:41:07.781 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:52: TEST_balancer: ceph balancer status
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout: "active": true,
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_duration": "0:00:00.000179",
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_started": "Sun Mar 8 22:40:41 2026",
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout: "mode": "upmap",
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout: "no_optimization_needed": false,
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout: "optimize_result": "No pools available",
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout: "plans": []
2026-03-08T22:41:07.993 INFO:tasks.workunit.client.0.vm01.stdout:}
2026-03-08T22:41:08.006 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:53: TEST_balancer: ceph balancer status
2026-03-08T22:41:08.006 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:53: TEST_balancer: jq .mode
2026-03-08T22:41:08.229 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:53: TEST_balancer: eval 'MODE="upmap"'
2026-03-08T22:41:08.229 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:53: TEST_balancer: MODE=upmap
2026-03-08T22:41:08.229 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:54: TEST_balancer: test upmap = upmap
2026-03-08T22:41:08.230 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:55: TEST_balancer: ceph balancer status
2026-03-08T22:41:08.230 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:55: TEST_balancer: jq .active
2026-03-08T22:41:08.447 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:55: TEST_balancer: ACTIVE=true
2026-03-08T22:41:08.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:56: TEST_balancer: test true = true
2026-03-08T22:41:08.448 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:58: TEST_balancer: ceph balancer ls
2026-03-08T22:41:08.665 INFO:tasks.workunit.client.0.vm01.stdout:[]
2026-03-08T22:41:08.678 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:59: TEST_balancer: ceph balancer ls
2026-03-08T22:41:08.905 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:59: TEST_balancer: PLANS='[]'
2026-03-08T22:41:08.905 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:60: TEST_balancer: test '[]' = '[]'
2026-03-08T22:41:08.905 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:61: TEST_balancer: ceph balancer eval
2026-03-08T22:41:09.114 INFO:tasks.workunit.client.0.vm01.stdout:current cluster score 0.000000 (lower is better)
2026-03-08T22:41:09.114 INFO:tasks.workunit.client.0.vm01.stdout:read_balance_scores (lower is better) {'test1': 1.8799999952316284, 'test2': 1.8799999952316284}
2026-03-08T22:41:09.127 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:62: TEST_balancer: ceph balancer eval
2026-03-08T22:41:09.356 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:62: TEST_balancer: EVAL='current cluster score 0.000000 (lower is better)
2026-03-08T22:41:09.356 INFO:tasks.workunit.client.0.vm01.stderr:read_balance_scores (lower is better) {'\''test1'\'': 1.8799999952316284, '\''test2'\'': 1.8799999952316284}'
2026-03-08T22:41:09.356 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:63: TEST_balancer: test 'current cluster score 0.000000 (lower is better)
2026-03-08T22:41:09.356 INFO:tasks.workunit.client.0.vm01.stderr:read_balance_scores (lower is better) {'\''test1'\'': 1.8799999952316284, '\''test2'\'': 1.8799999952316284}' = 'current cluster score 0.000000 (lower is better)'
2026-03-08T22:41:09.356 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:64: TEST_balancer: ceph balancer eval-verbose
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:current cluster
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:target_by_root {'default': {0: 0.3333333333333333, 1: 0.3333333333333333, 2: 0.3333333333333333}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:actual_by_pool {'test1': {'pgs': {1: 0.3333333333333333, 0: 0.3333333333333333, 2: 0.3333333333333333}, 'objects': {1: 0.0, 0: 0.0, 2: 0.0}, 'bytes': {1: 0.0, 0: 0.0, 2: 0.0}}, 'test2': {'pgs': {2: 0.3333333333333333, 1: 0.3333333333333333, 0: 0.3333333333333333}, 'objects': {2: 0.0, 1: 0.0, 0: 0.0}, 'bytes': {2: 0.0, 1: 0.0, 0: 0.0}}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:actual_by_root {'default': {'pgs': {0: 0.3333333333333333, 1: 0.3333333333333333, 2: 0.3333333333333333}, 'objects': {0: 0.0, 1: 0.0, 2: 0.0}, 'bytes': {0: 0.0, 1: 0.0, 2: 0.0}}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:count_by_pool {'test1': {'pgs': {1: 8, 0: 8, 2: 8}, 'objects': {1: 0, 0: 0, 2: 0}, 'bytes': {1: 0, 0: 0, 2: 0}}, 'test2': {'pgs': {2: 8, 1: 8, 0: 8}, 'objects': {2: 0, 1: 0, 0: 0}, 'bytes': {2: 0, 1: 0, 0: 0}}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:count_by_root {'default': {'pgs': {0: 16.0, 1: 16.0, 2: 16.0}, 'objects': {0: 0.0, 1: 0.0, 2: 0.0}, 'bytes': {0: 0.0, 1: 0.0, 2: 0.0}}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:total_by_pool {'test1': {'pgs': 24, 'objects': 0, 'bytes': 0}, 'test2': {'pgs': 24, 'objects': 0, 'bytes': 0}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:total_by_root {'default': {'pgs': 48, 'objects': 0, 'bytes': 0}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:stats_by_root {'default': {'pgs': {'max': 16.0, 'min': 16.0, 'avg': 16.0, 'stddev': 0.0, 'sum_weight': 0.0, 'score': 0.0}, 'objects': {'max': 0, 'min': 0, 'avg': 0, 'stddev': 0, 'sum_weight': 0, 'score': 0}, 'bytes': {'max': 0, 'min': 0, 'avg': 0, 'stddev': 0, 'sum_weight': 0, 'score': 0}}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:score_by_pool {}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:score_by_root {'default': {'pgs': 0.0, 'objects': 0, 'bytes': 0}}
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:score 0.000000 (lower is better)
2026-03-08T22:41:09.569 INFO:tasks.workunit.client.0.vm01.stdout:read_balance_score_by_pool {'test1': {'score_type': 'Fair distribution', 'score_acting': 1.8799999952316284, 'score_stable': 1.8799999952316284, 'optimal_score': 1.0, 'raw_score_acting': 1.8799999952316284, 'raw_score_stable': 1.8799999952316284, 'primary_affinity_weighted': 1.0, 'average_primary_affinity': 1.0, 'average_primary_affinity_weighted': 1.0}, 'test2': {'score_type': 'Fair distribution', 'score_acting': 1.8799999952316284, 'score_stable': 1.8799999952316284, 'optimal_score': 1.0, 'raw_score_acting': 1.8799999952316284, 'raw_score_stable': 1.8799999952316284, 'primary_affinity_weighted': 1.0, 'average_primary_affinity': 1.0, 'average_primary_affinity_weighted': 1.0}}
2026-03-08T22:41:09.581 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:66: TEST_balancer: ceph balancer pool add test1
2026-03-08T22:41:09.811 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:67: TEST_balancer: ceph balancer pool add test2
2026-03-08T22:41:10.055 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:68: TEST_balancer: ceph balancer pool ls
2026-03-08T22:41:10.273 INFO:tasks.workunit.client.0.vm01.stdout:[
2026-03-08T22:41:10.273 INFO:tasks.workunit.client.0.vm01.stdout: "test1",
2026-03-08T22:41:10.273 INFO:tasks.workunit.client.0.vm01.stdout: "test2"
2026-03-08T22:41:10.273 INFO:tasks.workunit.client.0.vm01.stdout:]
2026-03-08T22:41:10.287 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:69: TEST_balancer: ceph balancer pool ls
2026-03-08T22:41:10.287 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:69: TEST_balancer: jq 'sort | .[0]'
2026-03-08T22:41:10.541 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:69: TEST_balancer: eval 'POOL="test1"'
2026-03-08T22:41:10.542 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:69: TEST_balancer: POOL=test1
2026-03-08T22:41:10.542 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:70: TEST_balancer: test test1 = test1
2026-03-08T22:41:10.542 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:71: TEST_balancer: ceph balancer pool ls
2026-03-08T22:41:10.542 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:71: TEST_balancer: jq 'sort | .[1]'
2026-03-08T22:41:10.814 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:71: TEST_balancer: eval 'POOL="test2"'
2026-03-08T22:41:10.814 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:71: TEST_balancer: POOL=test2
2026-03-08T22:41:10.814 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:72: TEST_balancer: test test2 = test2
2026-03-08T22:41:10.814 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:73: TEST_balancer: ceph balancer pool rm test1
2026-03-08T22:41:11.644 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:74: TEST_balancer: ceph balancer pool rm test2
2026-03-08T22:41:11.873 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:75: TEST_balancer: ceph balancer pool ls
2026-03-08T22:41:12.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:76: TEST_balancer: ceph balancer pool add test1
2026-03-08T22:41:12.314 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:78: TEST_balancer: ceph balancer mode crush-compat
2026-03-08T22:41:12.583 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:79: TEST_balancer: ceph balancer status
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout: "active": true,
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_duration": "0:00:00.000179",
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_started": "Sun Mar 8 22:40:41 2026",
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout: "mode": "crush-compat",
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout: "no_optimization_needed": false,
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout: "optimize_result": "No pools available",
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout: "plans": []
2026-03-08T22:41:12.799 INFO:tasks.workunit.client.0.vm01.stdout:}
2026-03-08T22:41:12.817 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:80: TEST_balancer: ceph balancer status
2026-03-08T22:41:12.818 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:80: TEST_balancer: jq .mode
2026-03-08T22:41:13.060 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:80: TEST_balancer: eval 'MODE="crush-compat"'
2026-03-08T22:41:13.060 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:80: TEST_balancer: MODE=crush-compat
2026-03-08T22:41:13.060 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:81: TEST_balancer: test crush-compat = crush-compat
2026-03-08T22:41:13.060 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:82: TEST_balancer: ceph balancer off
2026-03-08T22:41:13.300 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:83: TEST_balancer: ceph balancer optimize plan_crush test1
2026-03-08T22:41:13.464 INFO:tasks.workunit.client.0.vm01.stderr:Error EALREADY: Distribution is already perfect
2026-03-08T22:41:13.468 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:84: TEST_balancer: ceph balancer status
2026-03-08T22:41:13.677 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-08T22:41:13.677 INFO:tasks.workunit.client.0.vm01.stdout: "active": false,
2026-03-08T22:41:13.677 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_duration": "0:00:00.000666",
2026-03-08T22:41:13.677 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_started": "Sun Mar 8 22:41:13 2026",
2026-03-08T22:41:13.677 INFO:tasks.workunit.client.0.vm01.stdout: "mode": "crush-compat",
2026-03-08T22:41:13.678 INFO:tasks.workunit.client.0.vm01.stdout: "no_optimization_needed": false,
2026-03-08T22:41:13.678 INFO:tasks.workunit.client.0.vm01.stdout: "optimize_result": "Distribution is already perfect",
2026-03-08T22:41:13.678 INFO:tasks.workunit.client.0.vm01.stdout: "plans": []
2026-03-08T22:41:13.678 INFO:tasks.workunit.client.0.vm01.stdout:}
2026-03-08T22:41:13.691 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:85: TEST_balancer: ceph balancer status
2026-03-08T22:41:13.691 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:85: TEST_balancer: jq .optimize_result
2026-03-08T22:41:13.934 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:85: TEST_balancer: eval 'RESULT="Distribution' is already 'perfect"'
2026-03-08T22:41:13.934 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:85: TEST_balancer: RESULT='Distribution is already perfect'
2026-03-08T22:41:13.934 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:86: TEST_balancer: test 'Distribution is already perfect' = 'Distribution is already perfect'
2026-03-08T22:41:13.934 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:88: TEST_balancer: ceph balancer on
2026-03-08T22:41:14.171 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:89: TEST_balancer: ceph balancer status
2026-03-08T22:41:14.171 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:89: TEST_balancer: jq .active
2026-03-08T22:41:14.391 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:89: TEST_balancer: ACTIVE=true
2026-03-08T22:41:14.391 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:90: TEST_balancer: test true = true
2026-03-08T22:41:14.391 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:91: TEST_balancer: sleep 2
2026-03-08T22:41:16.392 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:92: TEST_balancer: ceph balancer status
2026-03-08T22:41:16.618 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-08T22:41:16.618 INFO:tasks.workunit.client.0.vm01.stdout: "active": true,
2026-03-08T22:41:16.618 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_duration": "0:00:00.000382",
2026-03-08T22:41:16.618 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_started": "Sun Mar 8 22:41:14 2026",
2026-03-08T22:41:16.618 INFO:tasks.workunit.client.0.vm01.stdout: "mode": "crush-compat",
2026-03-08T22:41:16.618 INFO:tasks.workunit.client.0.vm01.stdout: "no_optimization_needed": false,
2026-03-08T22:41:16.619 INFO:tasks.workunit.client.0.vm01.stdout: "optimize_result": "Distribution is already perfect",
2026-03-08T22:41:16.619 INFO:tasks.workunit.client.0.vm01.stdout: "plans": []
2026-03-08T22:41:16.619 INFO:tasks.workunit.client.0.vm01.stdout:}
2026-03-08T22:41:16.631 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:93: TEST_balancer: ceph balancer off
2026-03-08T22:41:16.869 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:94: TEST_balancer: ceph balancer status
2026-03-08T22:41:16.869 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:94: TEST_balancer: jq .active
2026-03-08T22:41:17.093 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:94: TEST_balancer: ACTIVE=false
2026-03-08T22:41:17.093 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:95: TEST_balancer: test false = false
2026-03-08T22:41:17.093 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:96: TEST_balancer: sleep 2
2026-03-08T22:41:19.094 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:98: TEST_balancer: ceph balancer reset
2026-03-08T22:41:19.320 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:100: TEST_balancer: ceph balancer mode upmap
2026-03-08T22:41:19.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:101: TEST_balancer: ceph balancer status
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout: "active": false,
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_duration": "0:00:00.000382",
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_started": "Sun Mar 8 22:41:14 2026",
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout: "mode": "upmap",
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout: "no_optimization_needed": false,
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout: "optimize_result": "Distribution is already perfect",
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout: "plans": []
2026-03-08T22:41:19.777 INFO:tasks.workunit.client.0.vm01.stdout:}
2026-03-08T22:41:19.790 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:102: TEST_balancer: ceph balancer status
2026-03-08T22:41:19.790 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:102: TEST_balancer: jq .mode
2026-03-08T22:41:20.008 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:102: TEST_balancer: eval 'MODE="upmap"'
2026-03-08T22:41:20.008 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:102: TEST_balancer: MODE=upmap
2026-03-08T22:41:20.008 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:103: TEST_balancer: test upmap = upmap
2026-03-08T22:41:20.008 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:104: TEST_balancer: ceph balancer optimize plan_upmap
2026-03-08T22:41:20.190 INFO:tasks.workunit.client.0.vm01.stderr:Error EALREADY: Unable to find further optimization, or pool(s) pg_num is decreasing, or distribution is already perfect
2026-03-08T22:41:20.193 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:105: TEST_balancer: ceph balancer status
2026-03-08T22:41:20.428 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout: "active": false,
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_duration": "0:00:00.000221",
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_started": "Sun Mar 8 22:41:20 2026",
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout: "mode": "upmap",
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout: "no_optimization_needed": true,
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout: "optimize_result": "Unable to find further optimization, or pool(s) pg_num is decreasing, or distribution is already perfect",
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout: "plans": []
2026-03-08T22:41:20.439 INFO:tasks.workunit.client.0.vm01.stdout:}
2026-03-08T22:41:20.440 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:106: TEST_balancer: ceph balancer status
2026-03-08T22:41:20.440 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:106: TEST_balancer: jq .optimize_result
2026-03-08T22:41:20.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:106: TEST_balancer: eval 'RESULT="Unable' to find further optimization, or 'pool(s)' pg_num is decreasing, or distribution is already 'perfect"'
2026-03-08T22:41:20.703 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:106: TEST_balancer: RESULT='Unable to find further optimization, or pool(s) pg_num is decreasing, or distribution is already perfect'
2026-03-08T22:41:20.703 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:107: TEST_balancer: test 'Unable to find further optimization, or pool(s) pg_num is decreasing, or distribution is already perfect' = 'Unable to find further optimization, or pool(s) pg_num is decreasing, or distribution is already perfect'
2026-03-08T22:41:20.703 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:109: TEST_balancer: ceph balancer on
2026-03-08T22:41:21.079 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:110: TEST_balancer: ceph balancer status
2026-03-08T22:41:21.079 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:110: TEST_balancer: jq .active
2026-03-08T22:41:21.300 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:110: TEST_balancer: ACTIVE=true
2026-03-08T22:41:21.300 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:111: TEST_balancer: test true = true
2026-03-08T22:41:21.300 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:112: TEST_balancer: sleep 2
2026-03-08T22:41:23.301 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:113: TEST_balancer: ceph balancer status
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout: "active": true,
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_duration": "0:00:00.000142",
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout: "last_optimize_started": "Sun Mar 8 22:41:21 2026",
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout: "mode": "upmap",
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout: "no_optimization_needed": true,
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout: "optimize_result": "Unable to find further optimization, or pool(s) pg_num is decreasing, or distribution is already perfect",
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout: "plans": []
2026-03-08T22:41:23.507 INFO:tasks.workunit.client.0.vm01.stdout:}
2026-03-08T22:41:23.518 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:114: TEST_balancer: ceph balancer off
2026-03-08T22:41:23.747 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:115: TEST_balancer: ceph balancer status
2026-03-08T22:41:23.747 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:115: TEST_balancer: jq .active
2026-03-08T22:41:23.970 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:115: TEST_balancer: ACTIVE=false
2026-03-08T22:41:23.970 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:116: TEST_balancer: test false = false
2026-03-08T22:41:23.970 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:118: TEST_balancer: teardown td/balancer
2026-03-08T22:41:23.970 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/balancer
2026-03-08T22:41:23.970 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:41:23.970 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/balancer KILL
2026-03-08T22:41:23.971 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:41:23.971 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:41:23.971 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:41:23.971 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:41:23.971 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:41:24.087 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:41:24.087 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:41:24.088 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:41:24.088 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:41:24.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:41:24.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:41:24.089 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:41:24.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:24.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:41:24.090 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:41:24.090 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:24.091 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:41:24.091 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:41:24.091 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/balancer
2026-03-08T22:41:24.105 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:41:24.105 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:24.105 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:41:24.105 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19264
2026-03-08T22:41:24.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:41:24.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:41:24.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:29: run: for func in $funcs
2026-03-08T22:41:24.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:30: run: TEST_balancer2 td/balancer
2026-03-08T22:41:24.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:122: TEST_balancer2: local dir=td/balancer
2026-03-08T22:41:24.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:123: TEST_balancer2: TEST_PGS1=118
2026-03-08T22:41:24.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:124: TEST_balancer2: TEST_PGS2=132
2026-03-08T22:41:24.107 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:125: TEST_balancer2: expr 118 + 132
2026-03-08T22:41:24.107 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:125: TEST_balancer2: TOTAL_PGS=250
2026-03-08T22:41:24.107 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:126: TEST_balancer2: OSDS=5 2026-03-08T22:41:24.107 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:127: TEST_balancer2: DEFAULT_REPLICAS=3 2026-03-08T22:41:24.107 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:129: TEST_balancer2: expr '(' 118 '*' 3 ')' / 5 2026-03-08T22:41:24.108 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:129: TEST_balancer2: FINAL_PER_OSD1=70 2026-03-08T22:41:24.108 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:131: TEST_balancer2: expr '(' '(' 118 + 132 ')' '*' 3 ')' / 5 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:131: TEST_balancer2: FINAL_PER_OSD2=150 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:133: TEST_balancer2: CEPH_ARGS+='--osd_pool_default_pg_autoscale_mode=off ' 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:134: TEST_balancer2: CEPH_ARGS+='--debug_osd=20 ' 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:135: TEST_balancer2: setup td/balancer 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/balancer 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/balancer 2026-03-08T22:41:24.110 
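The `TEST_balancer2` trace above derives the test's expected PG counts by integer arithmetic with `expr`. Pulled out of the log into a standalone sketch (variable names and values taken directly from the trace; the `expr` quoting mirrors what the xtrace shows):

```shell
# Reproduction of the PG arithmetic traced in balancer.sh lines 123-131.
TEST_PGS1=118
TEST_PGS2=132
OSDS=5
DEFAULT_REPLICAS=3

# Total PGs across both test pools: 118 + 132 = 250.
TOTAL_PGS=$(expr $TEST_PGS1 + $TEST_PGS2)

# Expected PG instances per OSD after pool 1 only: (118 * 3) / 5 = 70.
FINAL_PER_OSD1=$(expr \( $TEST_PGS1 \* $DEFAULT_REPLICAS \) / $OSDS)

# Expected PG instances per OSD after both pools: (250 * 3) / 5 = 150.
FINAL_PER_OSD2=$(expr \( \( $TEST_PGS1 + $TEST_PGS2 \) \* $DEFAULT_REPLICAS \) / $OSDS)

echo "$TOTAL_PGS $FINAL_PER_OSD1 $FINAL_PER_OSD2"
```

These are the per-OSD targets the balancer test later checks against, which is why the trace records `TOTAL_PGS=250`, `FINAL_PER_OSD1=70`, and `FINAL_PER_OSD2=150`.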
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/balancer 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/balancer KILL 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:24.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:24.112 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:24.112 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:24.113 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:24.113 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:41:24.114 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:41:24.114 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:24.114 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:24.115 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:24.115 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:24.116 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:24.116 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:24.116 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:24.118 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:24.118 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/balancer 2026-03-08T22:41:24.118 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:24.118 
INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.118 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:24.119 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19264 2026-03-08T22:41:24.119 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:24.119 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:24.119 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/balancer 2026-03-08T22:41:24.120 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:41:24.120 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.120 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:24.121 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.19264 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/balancer 1' TERM HUP INT 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:136: TEST_balancer2: run_mon td/balancer a 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/balancer 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/balancer/a 2026-03-08T22:41:24.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/balancer/a --run-dir=td/balancer 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: 
get_asok_path: '[' -n '' ']' 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:41:24.149 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/balancer/a '--log-file=td/balancer/$name.log' '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --mon-cluster-log-file=td/balancer/log --run-dir=td/balancer '--pid-file=td/balancer/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:41:24.179 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:41:24.179 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:41:24.179 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local 
daemon=mon 2026-03-08T22:41:24.179 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:24.179 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:41:24.179 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:41:24.180 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:24.180 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:24.181 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:24.181 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:24.181 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.182 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:24.182 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19264/ceph-mon.a.asok 2026-03-08T22:41:24.182 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:24.182 
INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19264/ceph-mon.a.asok config get fsid 2026-03-08T22:41:24.256 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:41:24.256 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:24.256 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.257 
INFO:tasks.workunit.client.0.vm01.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:24.257 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19264/ceph-mon.a.asok 2026-03-08T22:41:24.258 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:24.258 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19264/ceph-mon.a.asok config get mon_host 2026-03-08T22:41:24.329 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:137: TEST_balancer2: run_mgr td/balancer x 2026-03-08T22:41:24.330 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/balancer 2026-03-08T22:41:24.330 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:41:24.330 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:41:24.330 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:41:24.330 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/balancer/x 2026-03-08T22:41:24.330 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:41:24.465 
INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:41:24.465 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:24.465 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:24.465 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:24.465 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.465 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:24.465 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:41:24.466 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:24.467 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/balancer/x '--log-file=td/balancer/$name.log' '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --run-dir=td/balancer '--pid-file=td/balancer/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:24.487 
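The `get_asok_dir`/`get_asok_path` pair traced repeatedly above (lines 108-120 of ceph-helpers.sh) decides where each daemon's admin socket lives. A minimal sketch reconstructed from the xtrace — note this is an assumption-level reconstruction of just the branches the log exercises, not a copy of the helper file: with no name argument the path keeps literal `$cluster-$name` placeholders for the daemon to expand, and with a name (e.g. `mon.a`) it resolves to a concrete `ceph-<name>.asok`:

```shell
# Sketch of the admin-socket path helpers, reconstructed from the xtrace.
get_asok_dir() {
    # Honor an externally supplied directory; otherwise use a per-run /tmp dir
    # (the trace shows /tmp/ceph-asok.19264, keyed to the test runner's PID).
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        # Concrete path for one daemon, e.g. .../ceph-mon.a.asok
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # Template path passed on the daemon command line; the daemon itself
        # substitutes $cluster and $name, so the placeholders stay literal here.
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

This explains why the trace shows both forms: `--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok` on the `ceph-mon`/`ceph-mgr`/`ceph-osd` command lines, but the resolved `/tmp/ceph-asok.19264/ceph-mon.a.asok` whenever `get_config` queries a specific daemon.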
INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:138: TEST_balancer2: expr 5 - 1 2026-03-08T22:41:24.494 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:138: TEST_balancer2: seq 0 4 2026-03-08T22:41:24.494 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:138: TEST_balancer2: for i in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:41:24.494 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:140: TEST_balancer2: run_osd td/balancer 0 2026-03-08T22:41:24.494 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer 2026-03-08T22:41:24.494 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/0 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 ' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:24.495 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/0' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/0/journal' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:24.495 
INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log' 2026-03-08T22:41:24.495 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid' 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' 
--osd-max-object-namespace-len=64' 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/0 2026-03-08T22:41:24.496 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:24.497 INFO:tasks.workunit.client.0.vm01.stdout:add osd0 c3d8cc8f-d32c-472b-8d65-ab11766ccf62 2026-03-08T22:41:24.497 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c3d8cc8f-d32c-472b-8d65-ab11766ccf62 2026-03-08T22:41:24.497 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 c3d8cc8f-d32c-472b-8d65-ab11766ccf62' 2026-03-08T22:41:24.497 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:24.509 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAU+61pC9pHHhAAbk1h/41jJn9rHMPXIpTBBQ== 
2026-03-08T22:41:24.510 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAU+61pC9pHHhAAbk1h/41jJn9rHMPXIpTBBQ=="}' 2026-03-08T22:41:24.510 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c3d8cc8f-d32c-472b-8d65-ab11766ccf62 -i td/balancer/0/new.json 2026-03-08T22:41:24.646 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-08T22:41:24.660 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/0/new.json 2026-03-08T22:41:24.660 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/0 --osd-journal=td/balancer/0/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAU+61pC9pHHhAAbk1h/41jJn9rHMPXIpTBBQ== --osd-uuid c3d8cc8f-d32c-472b-8d65-ab11766ccf62 2026-03-08T22:41:24.677 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:24.669+0000 7f14d6f5e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:24.683 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:24.677+0000 7f14d6f5e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:24.684 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:24.677+0000 7f14d6f5e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:24.684 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:24.677+0000 7f14d6f5e8c0 -1 bdev(0x5635071c3c00 td/balancer/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:41:24.684 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:24.677+0000 7f14d6f5e8c0 -1 bluestore(td/balancer/0) _read_fsid unparsable uuid
2026-03-08T22:41:26.934 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/0/keyring
2026-03-08T22:41:26.934 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:41:26.935 INFO:tasks.workunit.client.0.vm01.stdout:adding osd0 key to auth repository
2026-03-08T22:41:26.935 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:41:26.935 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:41:27.076 INFO:tasks.workunit.client.0.vm01.stdout:start osd.0
2026-03-08T22:41:27.076 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:41:27.076 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/0 --osd-journal=td/balancer/0/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:41:27.081 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:41:27.085 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:41:27.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:41:27.116 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:27.109+0000 7f993a9278c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:27.122 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:27.117+0000 7f993a9278c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:27.123 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:27.117+0000 7f993a9278c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:27.310 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:27.558 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:27.822 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:27.817+0000 7f993a9278c0 -1 Falling back to public interface
2026-03-08T22:41:28.559 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-08T22:41:28.559 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:28.559 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:28.559 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:41:28.559 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:28.559 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:28.781 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:29.037 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:29.029+0000 7f993a9278c0 -1 osd.0 0 log_to_monitors true
2026-03-08T22:41:29.782 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-08T22:41:29.782 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:29.782 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:29.782 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:41:29.782 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:29.782 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:30.087 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:31.088 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-08T22:41:31.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:31.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:31.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:41:31.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:31.089 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:31.339 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:32.341 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-08T22:41:32.341 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:32.341 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:32.341 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T22:41:32.341 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:32.341 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:32.563 INFO:tasks.workunit.client.0.vm01.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/194700013,v1:127.0.0.1:6803/194700013] [v2:127.0.0.1:6804/194700013,v1:127.0.0.1:6805/194700013] exists,up c3d8cc8f-d32c-472b-8d65-ab11766ccf62
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:138: TEST_balancer2: for i in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:140: TEST_balancer2: run_osd td/balancer 1
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/1
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 '
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/1'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/1/journal'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:32.564 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:41:32.565 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:41:32.565 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:41:32.565 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:41:32.565 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:41:32.565 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:41:32.565 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log'
2026-03-08T22:41:32.565 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid'
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/1
2026-03-08T22:41:32.566 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:41:32.567 INFO:tasks.workunit.client.0.vm01.stdout:add osd1 58ea69a6-736c-42e8-9066-7d0835ec7abc
2026-03-08T22:41:32.567 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=58ea69a6-736c-42e8-9066-7d0835ec7abc
2026-03-08T22:41:32.567 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 58ea69a6-736c-42e8-9066-7d0835ec7abc'
2026-03-08T22:41:32.567 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:41:32.580 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAc+61pSF2IIhAAI5tFZOfgF3zRukQgm2Yzug==
2026-03-08T22:41:32.580 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAc+61pSF2IIhAAI5tFZOfgF3zRukQgm2Yzug=="}'
2026-03-08T22:41:32.580 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 58ea69a6-736c-42e8-9066-7d0835ec7abc -i td/balancer/1/new.json
2026-03-08T22:41:32.803 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-08T22:41:32.816 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/1/new.json
2026-03-08T22:41:32.817 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/1 --osd-journal=td/balancer/1/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAc+61pSF2IIhAAI5tFZOfgF3zRukQgm2Yzug== --osd-uuid 58ea69a6-736c-42e8-9066-7d0835ec7abc
2026-03-08T22:41:32.833 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:32.825+0000 7fd1d18f88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:32.835 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:32.829+0000 7fd1d18f88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:32.836 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:32.829+0000 7fd1d18f88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:32.836 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:32.829+0000 7fd1d18f88c0 -1 bdev(0x55db77bedc00 td/balancer/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:41:32.836 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:32.829+0000 7fd1d18f88c0 -1 bluestore(td/balancer/1) _read_fsid unparsable uuid
2026-03-08T22:41:35.183 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/1/keyring
2026-03-08T22:41:35.183 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:41:35.184 INFO:tasks.workunit.client.0.vm01.stdout:adding osd1 key to auth repository
2026-03-08T22:41:35.184 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:41:35.184 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:41:35.478 INFO:tasks.workunit.client.0.vm01.stdout:start osd.1
2026-03-08T22:41:35.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:41:35.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/1 --osd-journal=td/balancer/1/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:41:35.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:41:35.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:41:35.488 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:41:35.495 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:35.489+0000 7f16e00828c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:35.496 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:35.489+0000 7f16e00828c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:35.501 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:35.493+0000 7f16e00828c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:35.710 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:35.711 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:41:35.934 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:35.958 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:35.953+0000 7f16e00828c0 -1 Falling back to public interface
2026-03-08T22:41:36.935 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-08T22:41:36.935 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:36.935 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:36.935 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:41:36.935 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:36.935 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:41:37.175 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:37.206 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:37.201+0000 7f16e00828c0 -1 osd.1 0 log_to_monitors true
2026-03-08T22:41:38.177 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-08T22:41:38.177 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:38.177 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:38.177 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:41:38.177 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:41:38.177 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:38.424 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/216548293,v1:127.0.0.1:6811/216548293] [v2:127.0.0.1:6812/216548293,v1:127.0.0.1:6813/216548293] exists,up 58ea69a6-736c-42e8-9066-7d0835ec7abc
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:138: TEST_balancer2: for i in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:140: TEST_balancer2: run_osd td/balancer 2
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/2
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 '
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/2'
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/2/journal'
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:41:39.655 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:41:39.656 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:41:39.657 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:41:39.657 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:41:39.657 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:41:39.657 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:41:39.657 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/2
2026-03-08T22:41:39.658 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:41:39.659 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e15a888f-5b42-4a9c-99df-5d38bf539fde
2026-03-08T22:41:39.659 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 e15a888f-5b42-4a9c-99df-5d38bf539fde'
2026-03-08T22:41:39.659 INFO:tasks.workunit.client.0.vm01.stdout:add osd2 e15a888f-5b42-4a9c-99df-5d38bf539fde
2026-03-08T22:41:39.659 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:41:39.672 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAj+61pn1/6JxAA7x67LMYGQRNCMqrJSe5m/w==
2026-03-08T22:41:39.672 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAj+61pn1/6JxAA7x67LMYGQRNCMqrJSe5m/w=="}'
2026-03-08T22:41:39.672 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e15a888f-5b42-4a9c-99df-5d38bf539fde -i td/balancer/2/new.json
2026-03-08T22:41:39.898 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-08T22:41:39.910 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/2/new.json
2026-03-08T22:41:39.911 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/2 --osd-journal=td/balancer/2/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAj+61pn1/6JxAA7x67LMYGQRNCMqrJSe5m/w== --osd-uuid e15a888f-5b42-4a9c-99df-5d38bf539fde
2026-03-08T22:41:39.928 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:39.921+0000 7f61fee5f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:39.930 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:39.925+0000 7f61fee5f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:39.931 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:39.925+0000 7f61fee5f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:39.932 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:39.925+0000 7f61fee5f8c0 -1 bdev(0x55ec2ae55c00 td/balancer/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:39.932 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:39.925+0000 7f61fee5f8c0 -1 bluestore(td/balancer/2) _read_fsid unparsable uuid 2026-03-08T22:41:42.214 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/2/keyring 2026-03-08T22:41:42.214 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:42.215 INFO:tasks.workunit.client.0.vm01.stdout:adding osd2 key to auth repository 2026-03-08T22:41:42.215 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:41:42.215 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:42.515 INFO:tasks.workunit.client.0.vm01.stdout:start osd.2 2026-03-08T22:41:42.515 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:41:42.515 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/balancer/2 --osd-journal=td/balancer/2/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:41:42.515 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:42.516 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:42.521 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:42.532 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:42.525+0000 7fa4e13f28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:42.533 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:42.529+0000 7fa4e13f28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:42.535 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:42.529+0000 7fa4e13f28c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:42.941 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:43.208 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:43.742 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:43.737+0000 7fa4e13f28c0 -1 Falling back to public interface 2026-03-08T22:41:44.209 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-08T22:41:44.209 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:44.209 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:44.209 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:44.209 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:44.209 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:44.449 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:44.745 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:44.741+0000 7fa4e13f28c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:41:45.450 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:45.450 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:45.450 INFO:tasks.workunit.client.0.vm01.stdout:2 2026-03-08T22:41:45.450 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:41:45.451 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:45.451 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:45.700 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:46.149 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:46.145+0000 7fa4dcbab640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:41:46.702 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-08T22:41:46.702 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:46.702 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:46.702 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:46.702 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:46.702 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1236841711,v1:127.0.0.1:6819/1236841711] [v2:127.0.0.1:6820/1236841711,v1:127.0.0.1:6821/1236841711] exists,up e15a888f-5b42-4a9c-99df-5d38bf539fde 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:46.936 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:138: TEST_balancer2: for i in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:140: TEST_balancer2: run_osd td/balancer 3 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/3 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 ' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/3' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/3/journal' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:46.936 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264 2026-03-08T22:41:46.937 
INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:46.937 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:41:46.937 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/3 2026-03-08T22:41:46.938 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:46.939 INFO:tasks.workunit.client.0.vm01.stdout:add osd3 18c9664a-e035-4dfa-94f3-34444fe9924e 2026-03-08T22:41:46.939 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=18c9664a-e035-4dfa-94f3-34444fe9924e 2026-03-08T22:41:46.939 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 18c9664a-e035-4dfa-94f3-34444fe9924e' 2026-03-08T22:41:46.939 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:46.951 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAq+61p5AWbOBAA5VmZjfowCNmrYYQaLtNZMg== 2026-03-08T22:41:46.951 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAq+61p5AWbOBAA5VmZjfowCNmrYYQaLtNZMg=="}' 2026-03-08T22:41:46.951 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 18c9664a-e035-4dfa-94f3-34444fe9924e -i td/balancer/3/new.json 2026-03-08T22:41:47.174 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-08T22:41:47.186 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/3/new.json 2026-03-08T22:41:47.187 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/3 --osd-journal=td/balancer/3/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAq+61p5AWbOBAA5VmZjfowCNmrYYQaLtNZMg== --osd-uuid 18c9664a-e035-4dfa-94f3-34444fe9924e 2026-03-08T22:41:47.204 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:47.197+0000 7f0013f268c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:47.206 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:47.201+0000 7f0013f268c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:47.207 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:47.201+0000 7f0013f268c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:47.207 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:47.201+0000 7f0013f268c0 -1 bdev(0x55d708b19c00 td/balancer/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:47.207 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:47.201+0000 7f0013f268c0 -1 bluestore(td/balancer/3) _read_fsid unparsable uuid 2026-03-08T22:41:49.466 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/3/keyring 2026-03-08T22:41:49.466 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:49.467 INFO:tasks.workunit.client.0.vm01.stdout:adding osd3 key to auth repository 2026-03-08T22:41:49.467 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:41:49.467 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:49.767 INFO:tasks.workunit.client.0.vm01.stdout:start osd.3 2026-03-08T22:41:49.767 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:41:49.767 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/3 --osd-journal=td/balancer/3/journal --chdir= --run-dir=td/balancer 
'--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:41:49.767 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:49.768 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:49.777 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:49.783 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:49.777+0000 7fc2c6eb88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:49.783 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:49.777+0000 7fc2c6eb88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:49.785 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:49.777+0000 7fc2c6eb88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:50.015 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:41:50.253 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:50.746 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:50.741+0000 7fc2c6eb88c0 -1 Falling back to public interface 2026-03-08T22:41:51.254 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-08T22:41:51.254 
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:51.254 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:51.254 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:41:51.254 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:51.254 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:41:51.477 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:51.734 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:51.729+0000 7fc2c6eb88c0 -1 osd.3 0 log_to_monitors true
2026-03-08T22:41:52.478 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-08T22:41:52.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:52.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:52.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:41:52.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:52.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:41:52.712 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:53.713 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-08T22:41:53.714 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:53.714 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:53.714 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:41:53.714 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:53.714 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:41:54.094 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:55.095 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-08T22:41:55.095 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:55.095 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:55.095 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T22:41:55.095 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:55.095 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:41:55.331 INFO:tasks.workunit.client.0.vm01.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3400997999,v1:127.0.0.1:6827/3400997999] [v2:127.0.0.1:6828/3400997999,v1:127.0.0.1:6829/3400997999] exists,up 18c9664a-e035-4dfa-94f3-34444fe9924e
2026-03-08T22:41:55.331 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:138: TEST_balancer2: for i in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:140: TEST_balancer2: run_osd td/balancer 4
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/balancer
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:55.332
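The xtrace above is ceph-helpers.sh's `wait_for_osd` completing for osd.3: it polls `ceph osd dump` up to 300 times, one second apart, until `osd.N up` appears, then returns 0. A minimal sketch of the same polling pattern follows; the stubbed `ceph` function (which reports osd.3 up on the third poll) and the counter file are illustrative stand-ins, not from the log:

```shell
#!/bin/sh
# Sketch of wait_for_osd from qa/standalone/ceph-helpers.sh: poll
# `ceph osd dump` once a second, up to 300 times, for "osd.N <state>".
wait_for_osd() {
    state=$1
    id=$2
    status=1
    i=0
    while [ "$i" -lt 300 ]; do
        echo "$i"                       # progress counter, as in the trace
        if ceph osd dump | grep -q "osd\.$id $state"; then
            status=0
            break
        fi
        sleep 1
        i=$((i + 1))
    done
    return "$status"
}

# Stubbed `ceph` for illustration only: osd.3 first reports up on the
# third poll. A file holds the call count because the pipeline above
# runs `ceph` in a subshell.
COUNTER=$(mktemp)
echo 0 > "$COUNTER"
ceph() {
    n=$(($(cat "$COUNTER") + 1))
    echo "$n" > "$COUNTER"
    if [ "$n" -ge 3 ]; then
        echo "osd.3 up   in  weight 1"
    fi
}

wait_for_osd up 3 && echo "osd.3 is up"
```

The real helper's 300 x 1 s budget is what bounds how long a slow OSD boot can stall a standalone test before it fails.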
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/balancer/4
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 '
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/balancer/4'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/balancer/4/journal'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/balancer'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:41:55.332 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/balancer/$name.log'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/balancer/$name.pid'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:41:55.333 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/balancer/4
2026-03-08T22:41:55.334 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:41:55.334 INFO:tasks.workunit.client.0.vm01.stdout:add osd4 3a85e5d3-cd37-4a5c-af17-b92c2ad31bc7 2026-03-08T22:41:55.334
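The lines above are `run_osd` assembling one long `ceph_args` string before launching `ceph-osd`. Note that `--admin-socket`, `--log-file` and `--pid-file` are kept single-quoted so that `$cluster` and `$name` are expanded by the daemon's own metavariable substitution, not by the shell. A condensed sketch of that assembly (values copied from this run; the final `echo` is a stand-in for the real `ceph-osd` invocation):

```shell
#!/bin/sh
# Condensed sketch of run_osd's argument assembly (ceph-helpers.sh 639-659).
dir=td/balancer
id=4
osd_data=$dir/$id

ceph_args='--fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none'
ceph_args="$ceph_args --mon-host=127.0.0.1:7102"
ceph_args="$ceph_args --osd-failsafe-full-ratio=.99"
ceph_args="$ceph_args --osd-data=$osd_data"
ceph_args="$ceph_args --osd-journal=$osd_data/journal"
ceph_args="$ceph_args --run-dir=$dir"
# $cluster/$name are escaped so the daemon expands them, not the shell:
ceph_args="$ceph_args --admin-socket=/tmp/ceph-asok.19264/\$cluster-\$name.asok"
ceph_args="$ceph_args --log-file=$dir/\$name.log --pid-file=$dir/\$name.pid"
ceph_args="$ceph_args --debug-osd=20 --debug-ms=1 --debug-monc=20"

echo ceph-osd -i "$id" $ceph_args   # stand-in for the real daemon launch
```

Building a flat string (rather than an array) is why the trace later shows the expanded command with the `$name`-bearing options still quoted.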
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3a85e5d3-cd37-4a5c-af17-b92c2ad31bc7
2026-03-08T22:41:55.334 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 3a85e5d3-cd37-4a5c-af17-b92c2ad31bc7'
2026-03-08T22:41:55.335 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:41:55.346 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAz+61pHh2RFBAAZvpeJh8VVGE3wlzsd/SqQw==
2026-03-08T22:41:55.346 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAz+61pHh2RFBAAZvpeJh8VVGE3wlzsd/SqQw=="}'
2026-03-08T22:41:55.346 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3a85e5d3-cd37-4a5c-af17-b92c2ad31bc7 -i td/balancer/4/new.json
2026-03-08T22:41:55.572 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-08T22:41:55.585 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/balancer/4/new.json
2026-03-08T22:41:55.618 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/4 --osd-journal=td/balancer/4/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAz+61pHh2RFBAAZvpeJh8VVGE3wlzsd/SqQw== --osd-uuid 3a85e5d3-cd37-4a5c-af17-b92c2ad31bc7
2026-03-08T22:41:55.619 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:55.597+0000 7f3ae92e38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:55.619 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:55.597+0000 7f3ae92e38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:55.619 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:55.601+0000 7f3ae92e38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:55.619 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:55.601+0000 7f3ae92e38c0 -1 bdev(0x5585fb04dc00 td/balancer/4/block) open stat got: (1) Operation not permitted
2026-03-08T22:41:55.619 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:55.601+0000 7f3ae92e38c0 -1 bluestore(td/balancer/4) _read_fsid unparsable uuid
2026-03-08T22:41:57.886 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/balancer/4/keyring
2026-03-08T22:41:57.887 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:41:57.887 INFO:tasks.workunit.client.0.vm01.stdout:adding osd4 key to auth repository
2026-03-08T22:41:57.887 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository
2026-03-08T22:41:57.887 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/balancer/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:41:58.169 INFO:tasks.workunit.client.0.vm01.stdout:start osd.4
2026-03-08T22:41:58.169 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4
2026-03-08T22:41:58.169 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:41:58.169 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:41:58.170 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:41:58.172 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=d3ee804c-8f63-4aa0-a598-9124427fbf35 --auth-supported=none --mon-host=127.0.0.1:7102 --osd_pool_default_pg_autoscale_mode=off --debug_osd=20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/balancer/4 --osd-journal=td/balancer/4/journal --chdir= --run-dir=td/balancer '--admin-socket=/tmp/ceph-asok.19264/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/balancer/$name.log' '--pid-file=td/balancer/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:41:58.205 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:58.189+0000 7fa2ed4488c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:58.205 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:58.201+0000 7fa2ed4488c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:58.207 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:58.201+0000 7fa2ed4488c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:58.380 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:41:58.595
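The trace above is `run_osd`'s standard standalone bring-up for osd.4: generate a uuid, register it with `ceph osd new`, format the data dir with `ceph-osd --mkfs` (the `bdev ... Operation not permitted` and `_read_fsid unparsable uuid` lines are emitted during the first mkfs on an empty directory; the format completes and the daemon starts afterwards, so the helper treats them as non-fatal), add the generated key with `ceph auth add`, check that the `noup` flag is not set, then start the daemon for real. An ordering sketch with every binary stubbed by `echo`, so it runs anywhere:

```shell
#!/bin/sh
# Ordering sketch of run_osd's OSD bring-up (ceph-helpers.sh 662-684).
# All real binaries are stubbed; the values are the ones seen in this run.
ceph() { echo "ceph $*"; }
ceph_osd() { echo "ceph-osd $*"; }   # stands in for the ceph-osd binary

id=4
uuid=3a85e5d3-cd37-4a5c-af17-b92c2ad31bc7     # uuidgen in the real helper
ceph osd new "$uuid" -i new.json               # register uuid, get the id
ceph_osd -i "$id" --mkfs --osd-uuid "$uuid"    # format the osd data dir
ceph -i keyring auth add "osd.$id" osd 'allow *' \
    mon 'allow profile osd' mgr 'allow profile osd'
ceph_osd -i "$id"                              # start the daemon for real
```

The `noup` check between `auth add` and the final start is what lets `run_osd` skip the subsequent `wait_for_osd up` when the cluster is deliberately holding OSDs down.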
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:58.666 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:41:58.661+0000 7fa2ed4488c0 -1 Falling back to public interface
2026-03-08T22:41:59.596 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:59.596 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:59.596 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-08T22:41:59.596 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:41:59.597 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:59.597 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up'
2026-03-08T22:41:59.811 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:00.177 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-08T22:42:00.169+0000 7fa2ed4488c0 -1 osd.4 0 log_to_monitors true
2026-03-08T22:42:00.812 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:00.812 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:00.812 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:00.812 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-08T22:42:00.812 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:00.812 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up'
2026-03-08T22:42:01.061 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:02.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:02.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:02.063 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-08T22:42:02.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:42:02.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:02.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up'
2026-03-08T22:42:02.304 INFO:tasks.workunit.client.0.vm01.stdout:osd.4 up in weight 1 up_from 25 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/2335964234,v1:127.0.0.1:6835/2335964234] [v2:127.0.0.1:6836/2335964234,v1:127.0.0.1:6837/2335964234] exists,up 3a85e5d3-cd37-4a5c-af17-b92c2ad31bc7
2026-03-08T22:42:02.305 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:42:02.305 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:42:02.305 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:42:02.305 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:143: TEST_balancer2: ceph osd set-require-min-compat-client luminous
2026-03-08T22:42:02.625 INFO:tasks.workunit.client.0.vm01.stderr:set require_min_compat_client to luminous
2026-03-08T22:42:02.639 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:144: TEST_balancer2: ceph config set mgr mgr/balancer/upmap_max_deviation 1
2026-03-08T22:42:02.866 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:145: TEST_balancer2: ceph balancer mode upmap
2026-03-08T22:42:03.092 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:146: TEST_balancer2: ceph balancer on
2026-03-08T22:42:03.313 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:147: TEST_balancer2: ceph config set mgr mgr/balancer/sleep_interval 5
2026-03-08T22:42:03.532 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:149: TEST_balancer2: create_pool test1 118
2026-03-08T22:42:03.533 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test1 118
2026-03-08T22:42:03.834 INFO:tasks.workunit.client.0.vm01.stderr:pool 'test1' already exists
2026-03-08T22:42:03.919 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:42:04.922
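With all five OSDs up, TEST_balancer2 (balancer.sh 143-149) configures the mgr balancer. The upmap mode relies on pg-upmap entries that only luminous-and-newer clients understand, which is why `set-require-min-compat-client luminous` precedes enabling it. The command sequence from the trace, with `ceph` stubbed by echo so the sketch runs without a cluster:

```shell
#!/bin/sh
# The balancer setup sequence from qa/standalone/mgr/balancer.sh, as seen
# in the trace. `ceph` is stubbed so the sketch runs anywhere.
ceph() { echo "ceph $*"; }

ceph osd set-require-min-compat-client luminous    # upmap needs luminous+
ceph config set mgr mgr/balancer/upmap_max_deviation 1
ceph balancer mode upmap
ceph balancer on
ceph config set mgr mgr/balancer/sleep_interval 5  # rebalance every 5 s
ceph osd pool create test1 118                     # create_pool test1 118
```

`upmap_max_deviation 1` asks the balancer to keep per-OSD PG counts within one PG of the mean, and the short `sleep_interval` makes the test observe rebalancing quickly; note the trace's `pool 'test1' already exists`, which `create_pool` tolerates.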
INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:151: TEST_balancer2: wait_for_clean
2026-03-08T22:42:04.922 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:42:04.922 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:42:04.922 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:42:04.922 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:42:04.923 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:42:04.923 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:42:04.923 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:42:04.923 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:42:04.923 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:42:04.983 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:42:04.983 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:42:04.983 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:42:04.983 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:42:04.983 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:42:04.983 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr:1
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr:2
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr:3
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr:4'
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:42:05.215 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:42:05.301 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836489
2026-03-08T22:42:05.301 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836489
2026-03-08T22:42:05.301 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489'
2026-03-08T22:42:05.301 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:42:05.301 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:42:05.387 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968
2026-03-08T22:42:05.388 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968
2026-03-08T22:42:05.388 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489 1-42949672968'
2026-03-08T22:42:05.388 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:42:05.388 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:42:05.481 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509446
2026-03-08T22:42:05.481 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509446
2026-03-08T22:42:05.481 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489 1-42949672968 2-64424509446'
2026-03-08T22:42:05.481 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:42:05.482 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:42:05.562 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345925
2026-03-08T22:42:05.562 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345925
2026-03-08T22:42:05.562 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489 1-42949672968 2-64424509446 3-85899345925'
2026-03-08T22:42:05.562 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:42:05.562 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:42:05.644 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182403
2026-03-08T22:42:05.644 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182403
2026-03-08T22:42:05.644 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489 1-42949672968 2-64424509446 3-85899345925 4-107374182403'
2026-03-08T22:42:05.644 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:42:05.644 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836489
2026-03-08T22:42:05.644 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:42:05.645 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:42:05.646 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836489
2026-03-08T22:42:05.646 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:42:05.647 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836489
2026-03-08T22:42:05.647 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836489'
2026-03-08T22:42:05.647 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.0 seq 21474836489
2026-03-08T22:42:05.647 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:42:05.876 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836489
2026-03-08T22:42:05.876 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:42:06.877 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:42:06.877 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:42:07.124 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836489
2026-03-08T22:42:07.125 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:42:08.125 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T22:42:08.126 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:42:08.343 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836489
2026-03-08T22:42:08.343 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:42:08.343 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968
2026-03-08T22:42:08.343 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:42:08.344 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:42:08.344 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968
2026-03-08T22:42:08.344 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:42:08.345 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968
2026-03-08T22:42:08.345 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672968'
2026-03-08T22:42:08.345 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.1 seq 42949672968
2026-03-08T22:42:08.345 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:42:08.569 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672968 -lt 42949672968
2026-03-08T22:42:08.569 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:42:08.569 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509446
2026-03-08T22:42:08.569 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:42:08.570 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:42:08.571 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509446
2026-03-08T22:42:08.571 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:42:08.572 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509446
2026-03-08T22:42:08.572 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.2 seq 64424509446
2026-03-08T22:42:08.572 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509446'
2026-03-08T22:42:08.572 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:42:08.812 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509447 -lt 64424509446
2026-03-08T22:42:08.812 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:42:08.812 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345925
2026-03-08T22:42:08.812 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:42:08.813 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:42:08.813 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345925
2026-03-08T22:42:08.813 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:42:08.814 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345925
2026-03-08T22:42:08.814 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345925'
2026-03-08T22:42:08.814 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.3 seq 85899345925
2026-03-08T22:42:08.814 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:42:09.039 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345925 -lt 85899345925
2026-03-08T22:42:09.039 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:42:09.040 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182403
2026-03-08T22:42:09.040 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:42:09.040 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:42:09.041 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182403
2026-03-08T22:42:09.041 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:42:09.042 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182403
2026-03-08T22:42:09.042 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.4 seq 107374182403
2026-03-08T22:42:09.042 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182403'
2026-03-08T22:42:09.042 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:42:09.267 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182404 -lt 107374182403
2026-03-08T22:42:09.267 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:42:09.267 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:42:09.267 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:42:09.574 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 118 == 0
2026-03-08T22:42:09.574 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:42:09.574 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:42:09.574 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:42:09.574 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:42:09.574 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:42:09.575 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:42:09.575 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:42:09.809 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=118
2026-03-08T22:42:09.809 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:42:09.810 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:42:09.810 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:42:10.104 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 118 = 118
2026-03-08T22:42:10.104 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:42:10.104 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:42:10.104 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:154: TEST_balancer2: OK=no
2026-03-08T22:42:10.104 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: seq 1 25
2026-03-08T22:42:10.105 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:10.105 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:15.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:15.149 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:15.149 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:20.110 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:20.112 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:20.112 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:25.113 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:25.115 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:25.115 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:30.116 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:30.119 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:30.119 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:35.120 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:35.122 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:35.161 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:40.123 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:40.126 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:40.126 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:45.127 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:45.129 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:45.129 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:50.130 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:50.133 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:50.133 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:42:55.134 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:42:55.136 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:42:55.136 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:43:00.137 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:43:00.140 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:155: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:43:00.140 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:157: TEST_balancer2: sleep 5
2026-03-08T22:43:05.141 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:158: TEST_balancer2: grep -q 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:43:05.143 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:160: TEST_balancer2: OK=yes
2026-03-08T22:43:05.143 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:161: TEST_balancer2: break
2026-03-08T22:43:05.143 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:164: TEST_balancer2: test yes = yes
2026-03-08T22:43:05.143 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:166: TEST_balancer2: sleep 10
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:167: TEST_balancer2: wait_for_clean
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:43:15.145 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:43:15.203 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:43:15.204 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:43:15.204 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:43:15.204 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:43:15.204 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:43:15.204 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr:1
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr:2
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr:3
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr:4'
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:15.429 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:43:15.525 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836505
2026-03-08T22:43:15.525 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836505
2026-03-08T22:43:15.525 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505'
2026-03-08T22:43:15.525 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:15.525 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:43:15.715 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672984
2026-03-08T22:43:15.715 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672984
2026-03-08T22:43:15.715 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984'
2026-03-08T22:43:15.715 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:15.715 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:43:15.803 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509462
2026-03-08T22:43:15.808 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509462
2026-03-08T22:43:15.808 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462'
2026-03-08T22:43:15.808 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:15.808 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:43:15.897 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345941
2026-03-08T22:43:15.897 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345941
2026-03-08T22:43:15.897 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-85899345941'
2026-03-08T22:43:15.897 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:15.897 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:43:16.010 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182419
2026-03-08T22:43:16.010 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182419
2026-03-08T22:43:16.010 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-85899345941 4-107374182419'
2026-03-08T22:43:16.010 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:16.010 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836505
2026-03-08T22:43:16.010 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:16.011 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:43:16.012 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836505
2026-03-08T22:43:16.012 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:16.012 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836505
2026-03-08T22:43:16.012 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836505'
2026-03-08T22:43:16.012 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.0 seq 21474836505
2026-03-08T22:43:16.013 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:43:16.241 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836503 -lt 21474836505
2026-03-08T22:43:16.241 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:43:17.242 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:43:17.242 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:43:17.478 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836505
2026-03-08T22:43:17.478 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:17.478 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672984
2026-03-08T22:43:17.478 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:17.479 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:43:17.480 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672984
2026-03-08T22:43:17.480 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:17.481 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672984
2026-03-08T22:43:17.481 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.1 seq 42949672984
2026-03-08T22:43:17.481 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672984'
2026-03-08T22:43:17.481 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:43:17.717 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672984 -lt 42949672984
2026-03-08T22:43:17.717 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:17.718 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509462
2026-03-08T22:43:17.718 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:17.719 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:43:17.719 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509462
2026-03-08T22:43:17.719 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:17.720 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509462
2026-03-08T22:43:17.720 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.2 seq 64424509462
2026-03-08T22:43:17.720 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509462'
2026-03-08T22:43:17.720 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:43:17.961 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509463 -lt 64424509462
2026-03-08T22:43:17.961 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:17.962 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345941
2026-03-08T22:43:17.962 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:17.963 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:43:17.963 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345941
2026-03-08T22:43:17.963 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:17.964 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345941
2026-03-08T22:43:17.964 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.3 seq 85899345941
2026-03-08T22:43:17.964 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345941'
2026-03-08T22:43:17.965 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:43:18.195 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345941 -lt 85899345941
2026-03-08T22:43:18.195 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:18.195 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182419
2026-03-08T22:43:18.195 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:18.196 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:43:18.196 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182419
2026-03-08T22:43:18.196 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:18.197 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182419
2026-03-08T22:43:18.197 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.4 seq 107374182419
2026-03-08T22:43:18.197 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182419'
2026-03-08T22:43:18.197 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:43:18.421 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182419 -lt 107374182419
2026-03-08T22:43:18.421 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:43:18.421 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:43:18.421 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 118 == 0
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:43:18.722 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:43:18.958 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=118
2026-03-08T22:43:18.958 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:43:18.958 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:43:18.958 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:43:19.264 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 118 = 118
2026-03-08T22:43:19.264 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:43:19.264 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:43:19.264 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:168: TEST_balancer2: ceph osd df
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout:ID CLASS WEIGHT REWEIGHT SIZE RAW USE DATA OMAP META AVAIL %USE VAR PGS STATUS
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout: 0 hdd 0.09769 1.00000 100 GiB 26 MiB 272 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 71 up
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout: 1 hdd 0.09769 1.00000 100 GiB 26 MiB 272 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 71 up
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout: 2 hdd 0.09769 1.00000 100 GiB 26 MiB 272 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 70 up
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout: 3 hdd 0.09769 1.00000 100 GiB 26 MiB 272 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 71 up
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout: 4 hdd 0.09769 1.00000 100 GiB 26 MiB 272 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 71 up
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout: TOTAL 500 GiB 132 MiB 1.3 MiB 7.8 KiB 131 MiB 500 GiB 0.03
2026-03-08T22:43:19.480 INFO:tasks.workunit.client.0.vm01.stdout:MIN/MAX VAR: 1.00/1.00 STDDEV: 0
2026-03-08T22:43:19.495 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:170: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:19.495 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:170: TEST_balancer2: jq '.nodes[0].pgs'
2026-03-08T22:43:19.728 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:170: TEST_balancer2: PGS=71
2026-03-08T22:43:19.728 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:171: TEST_balancer2: test 71 -ge 70
2026-03-08T22:43:19.728 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:172: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:19.728 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:172: TEST_balancer2: jq '.nodes[1].pgs'
2026-03-08T22:43:19.963 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:172: TEST_balancer2: PGS=71
2026-03-08T22:43:19.963 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:173: TEST_balancer2: test 71 -ge 70
2026-03-08T22:43:19.963 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:174: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:19.963 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:174: TEST_balancer2: jq '.nodes[2].pgs'
2026-03-08T22:43:20.196 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:174: TEST_balancer2: PGS=70
2026-03-08T22:43:20.196 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:175: TEST_balancer2: test 70 -ge 70
2026-03-08T22:43:20.196 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:176: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:20.196 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:176: TEST_balancer2: jq '.nodes[3].pgs'
2026-03-08T22:43:20.417 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:176: TEST_balancer2: PGS=71
2026-03-08T22:43:20.417 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:177: TEST_balancer2: test 71 -ge 70
2026-03-08T22:43:20.418 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:178: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:20.418 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:178: TEST_balancer2: jq '.nodes[4].pgs'
2026-03-08T22:43:20.641 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:178: TEST_balancer2: PGS=71
2026-03-08T22:43:20.641 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:179: TEST_balancer2: test 71 -ge 70
2026-03-08T22:43:20.641 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:181: TEST_balancer2: create_pool test2 132
2026-03-08T22:43:20.641 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test2 132
2026-03-08T22:43:20.949 INFO:tasks.workunit.client.0.vm01.stderr:pool 'test2' already exists
2026-03-08T22:43:21.020 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:43:22.024 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:184: TEST_balancer2: OK=no
2026-03-08T22:43:22.025 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:185: TEST_balancer2: seq 1 25
2026-03-08T22:43:22.025 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:185: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:43:22.025 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:187: TEST_balancer2: sleep 5
2026-03-08T22:43:27.027 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: grep 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:43:27.027 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: wc -l
2026-03-08T22:43:27.030 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: COUNT=1
2026-03-08T22:43:27.030 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:189: TEST_balancer2: test 1 = 2
2026-03-08T22:43:27.030 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:185: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:43:27.030 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:187: TEST_balancer2: sleep 5
2026-03-08T22:43:32.031 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: grep 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:43:32.031 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: wc -l
2026-03-08T22:43:32.034 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: COUNT=1
2026-03-08T22:43:32.034 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:189: TEST_balancer2: test 1 = 2
2026-03-08T22:43:32.034 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:185: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:43:32.034 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:187: TEST_balancer2: sleep 5
2026-03-08T22:43:37.035 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: grep 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:43:37.035 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: wc -l
2026-03-08T22:43:37.038 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: COUNT=1
2026-03-08T22:43:37.038 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:189: TEST_balancer2: test 1 = 2
2026-03-08T22:43:37.038 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:185: TEST_balancer2: for i in $(seq 1 25)
2026-03-08T22:43:37.038 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:187: TEST_balancer2: sleep 5
2026-03-08T22:43:42.040 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: grep 'Optimization plan is almost perfect' td/balancer/mgr.x.log
2026-03-08T22:43:42.040 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: wc -l
2026-03-08T22:43:42.043 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:188: TEST_balancer2: COUNT=2
2026-03-08T22:43:42.043 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:189: TEST_balancer2: test 2 = 2
2026-03-08T22:43:42.043 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:191: TEST_balancer2: OK=yes
2026-03-08T22:43:42.043 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:192: TEST_balancer2: break
2026-03-08T22:43:42.043 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:195: TEST_balancer2: test yes = yes
2026-03-08T22:43:42.043 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:197: TEST_balancer2: sleep 10
2026-03-08T22:43:52.045 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:198: TEST_balancer2: wait_for_clean
2026-03-08T22:43:52.045 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:43:52.045 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:43:52.045 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:43:52.045 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:43:52.045 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:43:52.045 INFO:tasks.workunit.client.0.vm01.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:43:52.046 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:43:52.046 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:43:52.046 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:43:52.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:43:52.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:43:52.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:43:52.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:43:52.106 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:43:52.107 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr:1
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr:2
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr:3
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr:4'
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:52.329 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:43:52.416 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836515
2026-03-08T22:43:52.416 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836515
2026-03-08T22:43:52.416 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515'
2026-03-08T22:43:52.416 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:52.417 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:43:52.508 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672993
2026-03-08T22:43:52.508 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672993
2026-03-08T22:43:52.508 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672993'
2026-03-08T22:43:52.508 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:52.508 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:43:52.594 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509472
2026-03-08T22:43:52.595 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509472
2026-03-08T22:43:52.595 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672993 2-64424509472'
2026-03-08T22:43:52.595 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:52.595 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:43:52.680 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345950
2026-03-08T22:43:52.680 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345950
2026-03-08T22:43:52.680 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672993 2-64424509472 3-85899345950'
2026-03-08T22:43:52.680 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:43:52.680 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:43:52.768 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182429
2026-03-08T22:43:52.768 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182429
2026-03-08T22:43:52.768 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672993 2-64424509472 3-85899345950 4-107374182429'
2026-03-08T22:43:52.768 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:52.768 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836515
2026-03-08T22:43:52.768 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:52.769 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:43:52.769 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836515
2026-03-08T22:43:52.769 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:52.770 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836515
2026-03-08T22:43:52.770 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.0 seq 21474836515
2026-03-08T22:43:52.770 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836515'
2026-03-08T22:43:52.770 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:43:52.991 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836515
2026-03-08T22:43:52.991 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:43:53.992 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:43:53.992 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:43:54.238 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836515 -lt 21474836515
2026-03-08T22:43:54.238 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:54.238 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672993
2026-03-08T22:43:54.239 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:54.239 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:43:54.240 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672993
2026-03-08T22:43:54.240 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:54.241 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672993
2026-03-08T22:43:54.241 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672993'
2026-03-08T22:43:54.241 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.1 seq 42949672993
2026-03-08T22:43:54.241 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:43:54.467 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672993 -lt 42949672993
2026-03-08T22:43:54.468 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:54.468 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509472
2026-03-08T22:43:54.468 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:54.469 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:43:54.469 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509472
2026-03-08T22:43:54.470 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:54.470 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509472
2026-03-08T22:43:54.470 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.2 seq 64424509472
2026-03-08T22:43:54.471 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509472'
2026-03-08T22:43:54.471 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:43:54.738 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509472 -lt 64424509472
2026-03-08T22:43:54.738 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:54.739 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345950
2026-03-08T22:43:54.739 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:54.740 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:43:54.740 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345950
2026-03-08T22:43:54.741 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:54.741 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345950
2026-03-08T22:43:54.741 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.3 seq 85899345950
2026-03-08T22:43:54.742 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345950'
2026-03-08T22:43:54.742 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:43:54.971 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345950 -lt 85899345950
2026-03-08T22:43:54.971 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:43:54.971 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182429
2026-03-08T22:43:54.972 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:43:54.972 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:43:54.973 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182429
2026-03-08T22:43:54.973 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:43:54.974 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182429
2026-03-08T22:43:54.974 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182429'
2026-03-08T22:43:54.974 INFO:tasks.workunit.client.0.vm01.stdout:waiting osd.4 seq 107374182429
2026-03-08T22:43:54.974 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:43:55.201 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182429 -lt 107374182429
2026-03-08T22:43:55.201 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:43:55.201 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:43:55.202 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:43:55.499 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 250 == 0
2026-03-08T22:43:55.500 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:43:55.500 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:43:55.500 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:43:55.500 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:43:55.500 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:43:55.500 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:43:55.500 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:43:55.741 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=250
2026-03-08T22:43:55.741 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:43:55.741 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:43:55.741 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:43:56.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 250 = 250
2026-03-08T22:43:56.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:43:56.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:43:56.063 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:199: TEST_balancer2: ceph osd df
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout:ID CLASS WEIGHT REWEIGHT SIZE RAW USE DATA OMAP META AVAIL %USE VAR PGS STATUS
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout: 0 hdd 0.09769 1.00000 100 GiB 27 MiB 376 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 151 up
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout: 1 hdd 0.09769 1.00000 100 GiB 27 MiB 376 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 150 up
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout: 2 hdd 0.09769 1.00000 100 GiB 27 MiB 376 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 149 up
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout: 3 hdd 0.09769 1.00000 100 GiB 27 MiB 376 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 150 up
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout: 4 hdd 0.09769 1.00000 100 GiB 27 MiB 376 KiB 1 KiB 26 MiB 100 GiB 0.03 1.00 150 up
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout: TOTAL 500 GiB 133 MiB 1.8 MiB 7.8 KiB 131 MiB 500 GiB 0.03
2026-03-08T22:43:56.269 INFO:tasks.workunit.client.0.vm01.stdout:MIN/MAX VAR: 1.00/1.00 STDDEV: 0
2026-03-08T22:43:56.284 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:203: TEST_balancer2: expr 150 - 2
2026-03-08T22:43:56.285 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:203: TEST_balancer2: MIN=148
2026-03-08T22:43:56.285 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:204: TEST_balancer2: expr 150 + 2
2026-03-08T22:43:56.286 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:204: TEST_balancer2: MAX=152
2026-03-08T22:43:56.286 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:205: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:56.286 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:205: TEST_balancer2: jq '.nodes[0].pgs'
2026-03-08T22:43:56.506 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:205: TEST_balancer2: PGS=151
2026-03-08T22:43:56.506 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:206: TEST_balancer2: test 151 -ge 148 -a 151 -le 152
2026-03-08T22:43:56.506 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:207: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:56.506 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:207: TEST_balancer2: jq '.nodes[1].pgs'
2026-03-08T22:43:56.733 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:207: TEST_balancer2: PGS=150
2026-03-08T22:43:56.733 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:208: TEST_balancer2: test 150 -ge 148 -a 150 -le 152
2026-03-08T22:43:56.733 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:209: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:56.733 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:209: TEST_balancer2: jq '.nodes[2].pgs'
2026-03-08T22:43:56.965 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:209: TEST_balancer2: PGS=149
2026-03-08T22:43:56.965 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:210: TEST_balancer2: test 149 -ge 148 -a 149 -le 152
2026-03-08T22:43:56.965 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:211: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:56.965 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:211: TEST_balancer2: jq '.nodes[3].pgs'
2026-03-08T22:43:57.189 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:211: TEST_balancer2: PGS=150
2026-03-08T22:43:57.189 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:212: TEST_balancer2: test 150 -ge 148 -a 150 -le 152
2026-03-08T22:43:57.189 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:213: TEST_balancer2: ceph osd df --format=json-pretty
2026-03-08T22:43:57.189 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:213: TEST_balancer2: jq '.nodes[4].pgs'
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:213: TEST_balancer2: PGS=150
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:214: TEST_balancer2: test 150 -ge 148 -a 150 -le 152
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh:216: TEST_balancer2: teardown td/balancer
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/balancer
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/balancer KILL
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:43:57.414 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:43:57.532 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:43:57.532 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:43:57.533 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:43:57.533 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:43:57.533 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:43:57.534 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:43:57.534 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:43:57.534 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:43:57.534 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:43:57.535 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:43:57.535 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:43:57.535 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:43:57.536 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:43:57.536 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/balancer
2026-03-08T22:43:57.575 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:43:57.575 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:43:57.575 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:43:57.575 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19264
2026-03-08T22:43:57.576 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:43:57.576 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:43:57.576 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0
2026-03-08T22:43:57.576 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/balancer 0
2026-03-08T22:43:57.576 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/balancer
2026-03-08T22:43:57.576 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0
2026-03-08T22:43:57.576 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/balancer KILL
2026-03-08T22:43:57.577 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:43:57.577 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:43:57.577 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:43:57.577 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:43:57.577 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:43:57.579 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:43:57.579 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:43:57.580 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:43:57.580 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:43:57.580 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:43:57.580 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:43:57.580 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:43:57.581 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:43:57.581 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:43:57.581 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:43:57.582 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:43:57.582 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:43:57.583 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']'
2026-03-08T22:43:57.583 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/balancer
2026-03-08T22:43:57.584 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:43:57.584 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:43:57.584 INFO:tasks.workunit.client.0.vm01.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19264
2026-03-08T22:43:57.584 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19264
2026-03-08T22:43:57.585 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:43:57.585 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:43:57.585 INFO:tasks.workunit.client.0.vm01.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0
2026-03-08T22:43:57.585 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-08T22:43:57.585 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-08T22:43:57.636 INFO:tasks.workunit:Stopping ['mgr'] on client.0...
2026-03-08T22:43:57.636 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-08T22:43:58.090 DEBUG:teuthology.parallel:result is None
2026-03-08T22:43:58.090 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-08T22:43:58.100 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-08T22:43:58.100 DEBUG:teuthology.orchestra.run.vm01:> rmdir -- /home/ubuntu/cephtest/mnt.0
2026-03-08T22:43:58.144 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0
2026-03-08T22:43:58.144 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-08T22:43:58.148 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-08T22:43:58.148 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-08T22:43:58.199 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system.
2026-03-08T22:43:58.199 DEBUG:teuthology.orchestra.run.vm01:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done
2026-03-08T22:43:58.267 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:43:58.440 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:43:58.440 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:43:58.551 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:43:58.551 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-08T22:43:58.551 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-08T22:43:58.551 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:43:58.564 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:43:58.565 INFO:teuthology.orchestra.run.vm01.stdout:  ceph*
2026-03-08T22:43:58.760 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded.
2026-03-08T22:43:58.760 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 47.1 kB disk space will be freed.
2026-03-08T22:43:58.806 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118605 files and directories currently installed.)
2026-03-08T22:43:58.808 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:59.928 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:43:59.961 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:00.146 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:00.146 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:00.326 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:00.326 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-08T22:44:00.327 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-08T22:44:00.327 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:00.337 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:00.338 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-cephadm* cephadm*
2026-03-08T22:44:00.522 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded.
2026-03-08T22:44:00.522 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 1775 kB disk space will be freed.
2026-03-08T22:44:00.562 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118603 files and directories currently installed.)
2026-03-08T22:44:00.565 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:00.584 INFO:teuthology.orchestra.run.vm01.stdout:Removing cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:00.612 INFO:teuthology.orchestra.run.vm01.stdout:Looking for files to backup/remove ...
2026-03-08T22:44:00.613 INFO:teuthology.orchestra.run.vm01.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*.
2026-03-08T22:44:00.615 INFO:teuthology.orchestra.run.vm01.stdout:Removing user `cephadm' ...
2026-03-08T22:44:00.615 INFO:teuthology.orchestra.run.vm01.stdout:Warning: group `nogroup' has no more members.
2026-03-08T22:44:00.625 INFO:teuthology.orchestra.run.vm01.stdout:Done.
2026-03-08T22:44:00.648 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-08T22:44:00.748 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118529 files and directories currently installed.)
2026-03-08T22:44:00.751 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:01.882 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:01.916 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:02.140 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:02.140 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:02.390 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:02.391 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-08T22:44:02.391 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-08T22:44:02.391 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:02.403 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:02.404 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mds*
2026-03-08T22:44:02.590 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded.
2026-03-08T22:44:02.590 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 7437 kB disk space will be freed.
2026-03-08T22:44:02.638 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118529 files and directories currently installed.)
2026-03-08T22:44:02.641 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:03.202 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-08T22:44:03.354 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118521 files and directories currently installed.)
2026-03-08T22:44:03.356 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:05.005 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:05.046 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:05.274 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:05.275 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:05.483 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:05.483 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python-pastedeploy-tpl python3-asyncssh python3-cachetools python3-cheroot
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-pecan python3-portend python3-psutil python3-pyinotify
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-simplegeneric python3-simplejson python3-singledispatch
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-sklearn python3-sklearn-lib python3-tempita python3-tempora
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-threadpoolctl python3-waitress python3-webob python3-websocket
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:  sg3-utils-udev
2026-03-08T22:44:05.484 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:05.496 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:05.497 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local*
2026-03-08T22:44:05.497 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-k8sevents*
2026-03-08T22:44:05.682 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 4 to remove and 10 not upgraded.
2026-03-08T22:44:05.682 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 165 MB disk space will be freed.
2026-03-08T22:44:05.727 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118521 files and directories currently installed.)
2026-03-08T22:44:05.729 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:05.743 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:05.768 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:05.805 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:06.343 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117937 files and directories currently installed.)
2026-03-08T22:44:06.346 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:07.982 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:08.016 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:08.205 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:08.206 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:08.372 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:08.372 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:08.372 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-08T22:44:08.373 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:08.386 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:08.387 INFO:teuthology.orchestra.run.vm01.stdout: ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw*
2026-03-08T22:44:08.560 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded.
2026-03-08T22:44:08.561 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 472 MB disk space will be freed.
2026-03-08T22:44:08.602 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117937 files and directories currently installed.)
2026-03-08T22:44:08.605 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-volume (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:08.671 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:09.084 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:09.558 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:09.960 INFO:teuthology.orchestra.run.vm01.stdout:Removing radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:10.347 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-test (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:10.384 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:10.840 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-08T22:44:10.878 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-08T22:44:10.954 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117455 files and directories currently installed.)
2026-03-08T22:44:10.957 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:11.624 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:12.079 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:12.528 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:12.968 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:14.572 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:14.604 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:14.796 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:14.797 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:15.011 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:15.011 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:15.011 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:44:15.012 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:44:15.013 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-08T22:44:15.013 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-08T22:44:15.013 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-08T22:44:15.013 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:15.027 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:15.028 INFO:teuthology.orchestra.run.vm01.stdout: ceph-fuse*
2026-03-08T22:44:15.199 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded.
2026-03-08T22:44:15.199 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 3673 kB disk space will be freed.
2026-03-08T22:44:15.245 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117443 files and directories currently installed.)
2026-03-08T22:44:15.248 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:15.662 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-08T22:44:15.768 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117434 files and directories currently installed.)
2026-03-08T22:44:15.769 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:17.247 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:17.284 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:17.478 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:17.478 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:17.609 INFO:teuthology.orchestra.run.vm01.stdout:Package 'ceph-test' is not installed, so not removed
2026-03-08T22:44:17.609 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:17.609 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:17.609 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-08T22:44:17.610 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:17.635 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:17.635 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:17.668 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:17.854 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:17.855 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout:Package 'ceph-volume' is not installed, so not removed
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-08T22:44:18.019 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-08T22:44:18.020 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-08T22:44:18.020 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:18.039 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:18.039 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:18.072 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:18.265 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:18.266 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:18.380 INFO:teuthology.orchestra.run.vm01.stdout:Package 'radosgw' is not installed, so not removed
2026-03-08T22:44:18.380 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:18.380 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:18.380 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-08T22:44:18.381 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:18.395 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:18.395 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:18.430 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:18.625 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:18.625 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:18.768 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:18.769 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-08T22:44:18.769 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip
2026-03-08T22:44:18.769 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:18.780 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:18.780 INFO:teuthology.orchestra.run.vm01.stdout: python3-cephfs* python3-rados* python3-rgw*
2026-03-08T22:44:18.947 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 3 to remove and 10 not upgraded.
2026-03-08T22:44:18.947 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 2062 kB disk space will be freed.
2026-03-08T22:44:18.995 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117434 files and directories currently installed.)
2026-03-08T22:44:18.997 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:19.010 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:19.022 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rados (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:20.158 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:20.194 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:20.400 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:20.401 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout:Package 'python3-rgw' is not installed, so not removed
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip
2026-03-08T22:44:20.547 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:20.570 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:20.570 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:20.601 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:20.757 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:20.757 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout:Package 'python3-cephfs' is not installed, so not removed
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip
2026-03-08T22:44:20.911 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:20.926 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:20.926 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:20.961 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:21.157 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:21.158 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:21.311 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:21.311 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:21.312 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1
2026-03-08T22:44:21.312 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip
2026-03-08T22:44:21.313 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:21.328 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:21.328 INFO:teuthology.orchestra.run.vm01.stdout: python3-rbd*
2026-03-08T22:44:21.509 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded.
2026-03-08T22:44:21.509 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 1186 kB disk space will be freed.
2026-03-08T22:44:21.550 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117410 files and directories currently installed.)
2026-03-08T22:44:21.552 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rbd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:22.701 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:22.736 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:22.948 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:22.948 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:23.113 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:23.113 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:23.113 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1
2026-03-08T22:44:23.113 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip
2026-03-08T22:44:23.114 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:23.121 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:23.121 INFO:teuthology.orchestra.run.vm01.stdout: libcephfs-dev* libcephfs2*
2026-03-08T22:44:23.295 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded.
2026-03-08T22:44:23.295 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 3202 kB disk space will be freed.
2026-03-08T22:44:23.342 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117402 files and directories currently installed.)
2026-03-08T22:44:23.345 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:23.377 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:23.432 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-08T22:44:24.684 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:24.721 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:24.962 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:24.962 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:25.214 INFO:teuthology.orchestra.run.vm01.stdout:Package 'libcephfs-dev' is not installed, so not removed
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:25.215 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip
2026-03-08T22:44:25.216 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:25.241 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:25.241 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:25.276 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:25.489 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:25.489 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-08T22:44:25.661 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:25.668 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:25.669 INFO:teuthology.orchestra.run.vm01.stdout: librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph*
2026-03-08T22:44:25.669 INFO:teuthology.orchestra.run.vm01.stdout: qemu-block-extra* rbd-fuse*
2026-03-08T22:44:25.839 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded.
2026-03-08T22:44:25.840 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 51.6 MB disk space will be freed.
2026-03-08T22:44:25.888 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117387 files and directories currently installed.)
2026-03-08T22:44:25.891 INFO:teuthology.orchestra.run.vm01.stdout:Removing rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:25.909 INFO:teuthology.orchestra.run.vm01.stdout:Removing libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:25.922 INFO:teuthology.orchestra.run.vm01.stdout:Removing libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:25.934 INFO:teuthology.orchestra.run.vm01.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-08T22:44:26.389 INFO:teuthology.orchestra.run.vm01.stdout:Removing librbd1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:26.401 INFO:teuthology.orchestra.run.vm01.stdout:Removing librgw2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:26.414 INFO:teuthology.orchestra.run.vm01.stdout:Removing librados2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:26.441 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-08T22:44:26.474 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-08T22:44:26.578 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117336 files and directories currently installed.)
2026-03-08T22:44:26.580 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-08T22:44:28.179 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:28.216 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:28.432 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:28.432 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:28.568 INFO:teuthology.orchestra.run.vm01.stdout:Package 'librbd1' is not installed, so not removed
2026-03-08T22:44:28.568 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:28.568 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:28.568 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-08T22:44:28.568 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:44:28.568 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-08T22:44:28.569 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:28.589 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:28.589 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:28.628 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:28.865 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:28.865 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:28.992 INFO:teuthology.orchestra.run.vm01.stdout:Package 'rbd-fuse' is not installed, so not removed
2026-03-08T22:44:28.992 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:44:28.992 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:28.992 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-08T22:44:28.992 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:44:28.992 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-08T22:44:28.993 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:44:29.007 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:44:29.007 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:29.009 DEBUG:teuthology.orchestra.run.vm01:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq
2026-03-08T22:44:29.060 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove
2026-03-08T22:44:29.138 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:29.361 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-08T22:44:29.361 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-08T22:44:29.504 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-08T22:44:29.504 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:44:29.504 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-08T22:44:29.504 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:44:29.504 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-08T22:44:29.504 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-08T22:44:29.505 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-08T22:44:29.680 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 87 to remove and 10 not upgraded.
2026-03-08T22:44:29.681 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 107 MB disk space will be freed.
2026-03-08T22:44:29.724 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117336 files and directories currently installed.)
2026-03-08T22:44:29.726 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:29.743 INFO:teuthology.orchestra.run.vm01.stdout:Removing jq (1.6-2.1ubuntu3.1) ...
2026-03-08T22:44:29.755 INFO:teuthology.orchestra.run.vm01.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ...
2026-03-08T22:44:29.768 INFO:teuthology.orchestra.run.vm01.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-08T22:44:29.780 INFO:teuthology.orchestra.run.vm01.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-08T22:44:29.793 INFO:teuthology.orchestra.run.vm01.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-08T22:44:29.805 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-08T22:44:29.818 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-08T22:44:29.831 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-08T22:44:29.855 INFO:teuthology.orchestra.run.vm01.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-08T22:44:29.866 INFO:teuthology.orchestra.run.vm01.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-08T22:44:29.878 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:44:29.888 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:44:29.899 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:44:29.909 INFO:teuthology.orchestra.run.vm01.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:44:29.920 INFO:teuthology.orchestra.run.vm01.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ...
2026-03-08T22:44:29.931 INFO:teuthology.orchestra.run.vm01.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-08T22:44:29.942 INFO:teuthology.orchestra.run.vm01.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-08T22:44:29.953 INFO:teuthology.orchestra.run.vm01.stdout:Removing luarocks (3.8.0+dfsg1-1) ...
2026-03-08T22:44:29.976 INFO:teuthology.orchestra.run.vm01.stdout:Removing liblua5.3-dev:amd64 (5.3.6-1build1) ...
2026-03-08T22:44:29.988 INFO:teuthology.orchestra.run.vm01.stdout:Removing libnbd0 (1.10.5-1) ...
2026-03-08T22:44:29.998 INFO:teuthology.orchestra.run.vm01.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-08T22:44:30.009 INFO:teuthology.orchestra.run.vm01.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-08T22:44:30.021 INFO:teuthology.orchestra.run.vm01.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-08T22:44:30.032 INFO:teuthology.orchestra.run.vm01.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ...
2026-03-08T22:44:30.044 INFO:teuthology.orchestra.run.vm01.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-08T22:44:30.056 INFO:teuthology.orchestra.run.vm01.stdout:Removing libreadline-dev:amd64 (8.1.2-1) ...
2026-03-08T22:44:30.067 INFO:teuthology.orchestra.run.vm01.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ...
2026-03-08T22:44:30.076 INFO:teuthology.orchestra.run.vm01.stdout:update-initramfs: deferring update (trigger activated)
2026-03-08T22:44:30.087 INFO:teuthology.orchestra.run.vm01.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ...
2026-03-08T22:44:30.109 INFO:teuthology.orchestra.run.vm01.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ...
2026-03-08T22:44:30.123 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua-any (27ubuntu1) ...
2026-03-08T22:44:30.135 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua-sec:amd64 (1.0.2-1) ...
2026-03-08T22:44:30.147 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ...
2026-03-08T22:44:30.164 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua5.1 (5.1.5-8.1build4) ...
2026-03-08T22:44:30.181 INFO:teuthology.orchestra.run.vm01.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ...
2026-03-08T22:44:30.608 INFO:teuthology.orchestra.run.vm01.stdout:Removing pkg-config (0.29.2-1ubuntu3) ...
2026-03-08T22:44:30.643 INFO:teuthology.orchestra.run.vm01.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-08T22:44:30.671 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pecan (1.3.3-4ubuntu2) ...
2026-03-08T22:44:30.737 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-webtest (2.0.35-1) ...
2026-03-08T22:44:30.788 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pastescript (2.0.2-4) ...
2026-03-08T22:44:30.839 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pastedeploy (2.1.1-1) ...
2026-03-08T22:44:30.890 INFO:teuthology.orchestra.run.vm01.stdout:Removing python-pastedeploy-tpl (2.1.1-1) ...
2026-03-08T22:44:30.967 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-08T22:44:31.025 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-08T22:44:31.306 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-google-auth (1.5.1-3) ...
2026-03-08T22:44:31.375 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cachetools (5.0.0-1) ...
2026-03-08T22:44:31.434 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:31.492 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:44:31.562 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cherrypy3 (18.6.1-4) ...
2026-03-08T22:44:31.634 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-08T22:44:31.699 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.collections (3.4.0-2) ...
2026-03-08T22:44:31.760 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.classes (3.2.1-3) ...
2026-03-08T22:44:31.817 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-portend (3.0.0-1) ...
2026-03-08T22:44:31.876 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-tempora (4.1.2-1) ...
2026-03-08T22:44:31.946 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.text (3.6.0-2) ...
2026-03-08T22:44:32.001 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.functools (3.4.0-2) ...
2026-03-08T22:44:32.054 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-08T22:44:32.188 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ...
2026-03-08T22:44:32.249 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-logutils (0.3.3-8) ...
2026-03-08T22:44:32.298 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-mako (1.1.3+ds1-2ubuntu0.1) ...
2026-03-08T22:44:32.350 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-natsort (8.0.2-1) ...
2026-03-08T22:44:32.398 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-paste (3.5.0+dfsg1-1) ...
2026-03-08T22:44:32.455 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-prettytable (2.5.0-2) ...
2026-03-08T22:44:32.500 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-psutil (5.9.0-1build1) ...
2026-03-08T22:44:32.552 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pyinotify (0.9.6-1.3) ...
2026-03-08T22:44:32.607 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-routes (2.5.1-1ubuntu1) ...
2026-03-08T22:44:32.660 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-08T22:44:32.715 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-08T22:44:32.768 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rsa (4.8-1) ...
2026-03-08T22:44:32.825 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-simplegeneric (0.8.1-3) ...
2026-03-08T22:44:32.873 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-08T22:44:32.926 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-singledispatch (3.4.0.3-3) ...
2026-03-08T22:44:32.975 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-08T22:44:33.001 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-tempita (0.5.2-6ubuntu1) ...
2026-03-08T22:44:33.047 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-08T22:44:33.093 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-waitress (1.4.4-1.1ubuntu1.1) ...
2026-03-08T22:44:33.141 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-08T22:44:33.192 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-08T22:44:33.242 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-08T22:44:33.295 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ...
2026-03-08T22:44:33.353 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-08T22:44:33.400 INFO:teuthology.orchestra.run.vm01.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-08T22:44:33.422 INFO:teuthology.orchestra.run.vm01.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-08T22:44:33.801 INFO:teuthology.orchestra.run.vm01.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-08T22:44:33.813 INFO:teuthology.orchestra.run.vm01.stdout:Removing unzip (6.0-26ubuntu3.2) ...
2026-03-08T22:44:33.833 INFO:teuthology.orchestra.run.vm01.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-08T22:44:33.851 INFO:teuthology.orchestra.run.vm01.stdout:Removing zip (3.0-12build2) ...
2026-03-08T22:44:33.877 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-08T22:44:33.888 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-08T22:44:33.988 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ...
2026-03-08T22:44:34.002 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-08T22:44:34.025 INFO:teuthology.orchestra.run.vm01.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-1092-kvm
2026-03-08T22:44:35.631 INFO:teuthology.orchestra.run.vm01.stdout:W: mkconf: MD subsystem is not loaded, thus I cannot scan for arrays.
2026-03-08T22:44:35.632 INFO:teuthology.orchestra.run.vm01.stdout:W: mdadm: failed to auto-generate temporary mdadm.conf file.
2026-03-08T22:44:37.657 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:44:37.660 DEBUG:teuthology.parallel:result is None
2026-03-08T22:44:37.660 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm01.local
2026-03-08T22:44:37.660 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-08T22:44:37.708 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-get update
2026-03-08T22:44:37.886 INFO:teuthology.orchestra.run.vm01.stdout:Hit:1 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-08T22:44:38.016 INFO:teuthology.orchestra.run.vm01.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-08T22:44:38.051 INFO:teuthology.orchestra.run.vm01.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-08T22:44:38.087 INFO:teuthology.orchestra.run.vm01.stdout:Hit:4 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-08T22:44:38.966 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-08T22:44:38.977 DEBUG:teuthology.parallel:result is None
2026-03-08T22:44:38.978 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-08T22:44:38.980 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-08T22:44:38.980 DEBUG:teuthology.orchestra.run.vm01:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:44:39.876 INFO:teuthology.orchestra.run.vm01.stdout:     remote           refid      st t when poll reach   delay   offset  jitter
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:==============================================================================
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout: 0.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout: 1.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout: 2.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout: 3.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout: ntp.ubuntu.com  .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+158.101.188.125 131.188.3.220    2 u   64   64   37   21.013   +0.017   5.874
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+s7.vonderste.in 131.188.3.222    2 u   58   64   37   28.301   +1.431   3.357
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+netcup01.therav 171.237.1.87     2 u   56   64   37   28.262   -1.474   3.997
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+ntp3.uni-ulm.de 129.69.253.1     2 u   52   64   37   28.107   +0.289   4.032
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+81.3.27.46 (ntp 131.188.3.220    2 u   58   64   37   27.917   +0.916   4.427
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+basilisk.mybb.d 171.237.1.87     2 u   56   64   37   28.311   -1.438   3.599
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+server1b.meinbe 131.188.3.222    2 u   63   64   37   23.576   +0.808   4.375
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+alpha.rueckgr.a 131.188.3.222    2 u   58   64   37   28.076   -0.650   5.549
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+time4.beanman.n 30.20.35.61      2 u   54   64   37   28.409   -1.187   3.098
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:*ntp3.lwlcom.net .GPS.            1 u   53   64   37   31.045   +4.192   3.095
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+139-162-156-95. 81.104.22.229    2 u   56   64   37   22.601   -4.375   3.351
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+nur1.aup.dk     131.188.3.222    2 u   60   64   37   23.849   +0.785   4.448
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+ntp2.adminforge 131.188.3.220    2 u   58   64   37   27.941   +1.949   3.044
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+alphyn.canonica 132.163.96.1     2 u    1   64   77  102.014   -1.968   3.310
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+185.252.140.125 216.239.35.4     2 u   57   64   37   25.270   +1.034   4.060
2026-03-08T22:44:39.906 INFO:teuthology.orchestra.run.vm01.stdout:+timegoesbrrr.ne 131.188.3.221    2 u   59   64   37   28.364   +1.605   2.815
2026-03-08T22:44:39.907 INFO:teuthology.orchestra.run.vm01.stdout:+185.125.190.56  79.243.60.50     2 u    1   64   77   35.674   -0.503   3.908
2026-03-08T22:44:39.907 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-08T22:44:39.909 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-08T22:44:39.910 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-08T22:44:39.916 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-08T22:44:39.919 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-08T22:44:39.925 INFO:teuthology.task.internal:Duration was 461.961915 seconds
2026-03-08T22:44:39.925 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-08T22:44:39.928 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-08T22:44:39.928 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-08T22:44:39.954 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-08T22:44:39.954 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm01.local
2026-03-08T22:44:39.954 DEBUG:teuthology.orchestra.run.vm01:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-08T22:44:40.009 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-08T22:44:40.010 DEBUG:teuthology.orchestra.run.vm01:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:44:40.073 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-08T22:44:40.073 DEBUG:teuthology.orchestra.run.vm01:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-08T22:44:40.122 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:44:40.123 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:44:40.123 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0%gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:44:40.123 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-08T22:44:40.123 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-08T22:44:40.127 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 85.1% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-08T22:44:40.128 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-08T22:44:40.132 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-08T22:44:40.132 DEBUG:teuthology.orchestra.run.vm01:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-08T22:44:40.177 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-08T22:44:40.181 DEBUG:teuthology.orchestra.run.vm01:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:44:40.229 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern = core
2026-03-08T22:44:40.238 DEBUG:teuthology.orchestra.run.vm01:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:44:40.285 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:44:40.285 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-08T22:44:40.288 INFO:teuthology.task.internal:Transferring archived files...
2026-03-08T22:44:40.289 DEBUG:teuthology.misc:Transferring archived files from vm01:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/277/remote/vm01
2026-03-08T22:44:40.289 DEBUG:teuthology.orchestra.run.vm01:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-08T22:44:40.337 INFO:teuthology.task.internal:Removing archive directory...
2026-03-08T22:44:40.337 DEBUG:teuthology.orchestra.run.vm01:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-08T22:44:40.384 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-08T22:44:40.398 INFO:teuthology.task.internal:Not uploading archives.
2026-03-08T22:44:40.398 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-08T22:44:40.409 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-08T22:44:40.409 DEBUG:teuthology.orchestra.run.vm01:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-08T22:44:40.428 INFO:teuthology.orchestra.run.vm01.stdout:   258068      4 drwxr-xr-x   2 ubuntu   ubuntu       4096 Mar  8 22:44 /home/ubuntu/cephtest
2026-03-08T22:44:40.455 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-08T22:44:40.493 INFO:teuthology.run:Summary data:
description: rados:standalone/{supported-random-distro$/{ubuntu_latest} workloads/mgr}
duration: 461.9619154930115
flavor: default
owner: kyr
success: true

2026-03-08T22:44:40.493 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:44:40.516 INFO:teuthology.run:pass