2026-03-08T22:39:36.753 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-08T22:39:36.757 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:39:36.806 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/281
branch: squid
description: rados:standalone/{supported-random-distro$/{ubuntu_latest} workloads/osd-backfill}
email: null
first_in_suite: false
flavor: default
job_id: '281'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 5909
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
suite: rados:standalone
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm06.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBqoJsifzIvqk9K2EFEKXO2ZUquZHP3OlSA4bEaZuR1sPUCn8SYWGtpuwX1+oRVxYe08hf9r7YzzHRfV9wjSpXs=
tasks:
- install: null
- workunit:
    basedir: qa/standalone
    clients:
      all:
      - osd-backfill
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-08_21:49:43
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-08T22:39:36.806 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-08T22:39:36.806 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-08T22:39:36.807 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-08T22:39:36.807 INFO:teuthology.task.internal:Checking packages...
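The block above is the job's full YAML configuration as teuthology.run received it. A minimal sketch of inspecting such a config offline, assuming PyYAML is available and that internal.save_config writes the config to config.yaml under the archive path (both assumptions, not shown in this log):

```python
# Illustrative sketch only: load the archived job config and pull out the
# fields this run uses later. The config.yaml location is an assumption.
import yaml

archive = "/archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/281"
with open(f"{archive}/config.yaml") as f:
    job = yaml.safe_load(f)

print(job["name"])                       # job name
print(job["roles"][0])                   # ['mon.a', 'mgr.x', 'osd.0', ...]
print(job["overrides"]["ceph"]["sha1"])  # build sha1 pinned for this run
print(job["tasks"])                      # [{'install': None}, {'workunit': ...}]
```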
2026-03-08T22:39:36.807 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-08T22:39:36.807 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-08T22:39:36.807 INFO:teuthology.packaging:ref: None
2026-03-08T22:39:36.807 INFO:teuthology.packaging:tag: None
2026-03-08T22:39:36.807 INFO:teuthology.packaging:branch: squid
2026-03-08T22:39:36.807 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:39:36.807 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=squid
2026-03-08T22:39:37.438 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:39:37.439 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-08T22:39:37.451 INFO:teuthology.task.internal:no buildpackages task found
2026-03-08T22:39:37.451 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-08T22:39:37.464 INFO:teuthology.task.internal:Saving configuration
2026-03-08T22:39:37.468 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-08T22:39:37.488 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-08T22:39:37.495 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm06.local', 'description': '/archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/281', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 22:38:53.939779', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:06', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBqoJsifzIvqk9K2EFEKXO2ZUquZHP3OlSA4bEaZuR1sPUCn8SYWGtpuwX1+oRVxYe08hf9r7YzzHRfV9wjSpXs='}
2026-03-08T22:39:37.495 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-08T22:39:37.509 INFO:teuthology.task.internal:roles: ubuntu@vm06.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-08T22:39:37.509 INFO:teuthology.run_tasks:Running task console_log...
2026-03-08T22:39:37.517 DEBUG:teuthology.task.console_log:vm06 does not support IPMI; excluding
2026-03-08T22:39:37.518 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f7fe3588f70>, signals=[15])
2026-03-08T22:39:37.518 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-08T22:39:37.518 INFO:teuthology.task.internal:Opening connections...
2026-03-08T22:39:37.518 DEBUG:teuthology.task.internal:connecting to ubuntu@vm06.local
2026-03-08T22:39:37.519 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:39:37.582 INFO:teuthology.run_tasks:Running task internal.push_inventory...
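The shaman query above is a plain HTTP GET against the search API. A hedged sketch of reproducing it; the URL and parameters are verbatim from the log, while the response field names are assumptions based on typical shaman output:

```python
# Sketch of the package lookup the check_packages task performs above.
import requests

resp = requests.get(
    "https://shaman.ceph.com/api/search",
    params={
        "status": "ready",
        "project": "ceph",
        "flavor": "default",
        "distros": "ubuntu/22.04/x86_64",
        "ref": "squid",
    },
    timeout=30,
)
resp.raise_for_status()
for build in resp.json():
    # 'sha1' and 'url' are assumed field names in the returned repo records.
    print(build.get("sha1"), build.get("url"))
```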
2026-03-08T22:39:37.583 DEBUG:teuthology.orchestra.run.vm06:> uname -m
2026-03-08T22:39:37.704 INFO:teuthology.orchestra.run.vm06.stdout:x86_64
2026-03-08T22:39:37.705 DEBUG:teuthology.orchestra.run.vm06:> cat /etc/os-release
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:NAME="Ubuntu"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:VERSION_ID="22.04"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:VERSION_CODENAME=jammy
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:ID=ubuntu
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:ID_LIKE=debian
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-08T22:39:37.748 INFO:teuthology.orchestra.run.vm06.stdout:UBUNTU_CODENAME=jammy
2026-03-08T22:39:37.748 INFO:teuthology.lock.ops:Updating vm06.local on lock server
2026-03-08T22:39:37.753 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-08T22:39:37.754 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-08T22:39:37.755 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-08T22:39:37.755 DEBUG:teuthology.orchestra.run.vm06:> test '!' -e /home/ubuntu/cephtest
2026-03-08T22:39:37.791 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-08T22:39:37.792 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-08T22:39:37.792 DEBUG:teuthology.orchestra.run.vm06:> test -z $(ls -A /var/lib/ceph)
2026-03-08T22:39:37.835 INFO:teuthology.orchestra.run.vm06.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T22:39:37.836 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-08T22:39:37.842 DEBUG:teuthology.orchestra.run.vm06:> test -e /ceph-qa-ready
2026-03-08T22:39:37.879 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:39:38.117 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-08T22:39:38.118 INFO:teuthology.task.internal:Creating test directory...
2026-03-08T22:39:38.118 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T22:39:38.122 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-08T22:39:38.123 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-08T22:39:38.124 INFO:teuthology.task.internal:Creating archive directory...
2026-03-08T22:39:38.124 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T22:39:38.169 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-08T22:39:38.171 INFO:teuthology.task.internal:Enabling coredump saving...
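The distro probe above is just `uname -m` plus `/etc/os-release`. A stand-alone sketch of parsing that file the same way; this parser is illustrative, not teuthology's own code:

```python
# Minimal /etc/os-release parser: KEY=value pairs, values optionally quoted.
def parse_os_release(text: str) -> dict:
    info = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            info[key] = value.strip('"')
    return info

with open("/etc/os-release") as f:
    os_info = parse_os_release(f.read())

print(os_info["ID"], os_info["VERSION_ID"])  # e.g. ubuntu 22.04
print(os_info["VERSION_CODENAME"])           # e.g. jammy
```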
2026-03-08T22:39:38.171 DEBUG:teuthology.orchestra.run.vm06:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T22:39:38.211 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:39:38.212 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T22:39:38.261 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:39:38.266 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:39:38.267 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-08T22:39:38.268 INFO:teuthology.task.internal:Configuring sudo...
2026-03-08T22:39:38.268 DEBUG:teuthology.orchestra.run.vm06:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T22:39:38.317 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-08T22:39:38.319 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-08T22:39:38.319 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T22:39:38.364 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:39:38.408 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:39:38.455 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T22:39:38.456 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T22:39:38.507 DEBUG:teuthology.orchestra.run.vm06:> sudo service rsyslog restart
2026-03-08T22:39:38.569 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-08T22:39:38.572 INFO:teuthology.task.internal:Starting timer...
2026-03-08T22:39:38.572 INFO:teuthology.run_tasks:Running task pcp...
2026-03-08T22:39:38.576 INFO:teuthology.run_tasks:Running task selinux...
2026-03-08T22:39:38.616 INFO:teuthology.task.selinux:Excluding vm06: VMs are not yet supported
2026-03-08T22:39:38.616 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-08T22:39:38.616 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-08T22:39:38.616 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-08T22:39:38.616 INFO:teuthology.run_tasks:Running task ansible.cephlab...
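The coredump step above amounts to pointing kernel.core_pattern at the test archive and persisting it via /etc/sysctl.conf. A sketch of the same sequence from Python, mirroring the shell in the log (a sketch under the assumption of passwordless sudo; %t expands to the dump time in epoch seconds and %p to the PID, which is why the archived core files sort chronologically):

```python
# Sketch of the coredump setup: create the archive dir, set the pattern,
# and append it to /etc/sysctl.conf so it survives a reboot.
import subprocess

coredump_dir = "/home/ubuntu/cephtest/archive/coredump"
pattern = f"kernel.core_pattern={coredump_dir}/%t.%p.core"

subprocess.run(["install", "-d", "-m0755", "--", coredump_dir], check=True)
subprocess.run(["sudo", "sysctl", "-w", pattern], check=True)
with subprocess.Popen(["sudo", "tee", "-a", "/etc/sysctl.conf"],
                      stdin=subprocess.PIPE) as tee:
    tee.communicate((pattern + "\n").encode())
```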
2026-03-08T22:39:38.652 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-08T22:39:38.652 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-08T22:39:38.654 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-08T22:39:39.112 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-08T22:39:39.118 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-08T22:39:39.118 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventoryd822lw53 --limit vm06.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-08T22:41:55.145 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm06.local')]
2026-03-08T22:41:55.145 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm06.local'
2026-03-08T22:41:55.146 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:41:55.206 DEBUG:teuthology.orchestra.run.vm06:> true
2026-03-08T22:41:55.416 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm06.local'
2026-03-08T22:41:55.417 INFO:teuthology.run_tasks:Running task clock...
2026-03-08T22:41:55.419 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-08T22:41:55.419 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T22:41:55.419 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Command line: ntpd -gq
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: ----------------------------------------------------
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: ntp-4 is maintained by Network Time Foundation,
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: corporation. Support and training for ntp-4 are
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: available at https://www.nwtime.org/support
2026-03-08T22:41:55.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: ----------------------------------------------------
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: proto: precision = 0.040 usec (-24)
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: basedate set to 2022-02-04
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: gps base set to 2022-02-06 (week 2196)
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stderr: 8 Mar 22:41:55 ntpd[15965]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 71 days ago
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Listen and drop on 0 v6wildcard [::]:123
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Listen normally on 2 lo 127.0.0.1:123
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Listen normally on 3 ens3 192.168.123.106:123
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Listen normally on 4 lo [::1]:123
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:6%2]:123
2026-03-08T22:41:55.480 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:55 ntpd[15965]: Listening on routing socket on fd #22 for interface updates
2026-03-08T22:41:56.479 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:56 ntpd[15965]: Soliciting pool server 93.241.86.156
2026-03-08T22:41:57.477 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:57 ntpd[15965]: Soliciting pool server 193.32.222.35
2026-03-08T22:41:57.478 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:57 ntpd[15965]: Soliciting pool server 217.115.11.162
2026-03-08T22:41:58.477 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:58 ntpd[15965]: Soliciting pool server 49.12.125.53
2026-03-08T22:41:58.477 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:58 ntpd[15965]: Soliciting pool server 51.75.67.47
2026-03-08T22:41:58.478 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:58 ntpd[15965]: Soliciting pool server 185.216.176.59
2026-03-08T22:41:59.476 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:59 ntpd[15965]: Soliciting pool server 217.197.91.176
2026-03-08T22:41:59.477 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:59 ntpd[15965]: Soliciting pool server 185.252.140.126
2026-03-08T22:41:59.477 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:59 ntpd[15965]: Soliciting pool server 152.53.15.80
2026-03-08T22:41:59.477 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:41:59 ntpd[15965]: Soliciting pool server 185.232.69.65
2026-03-08T22:42:00.476 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:00 ntpd[15965]: Soliciting pool server 168.119.211.223
2026-03-08T22:42:00.476 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:00 ntpd[15965]: Soliciting pool server 159.195.55.239
2026-03-08T22:42:00.476 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:00 ntpd[15965]: Soliciting pool server 85.215.166.214
2026-03-08T22:42:00.477 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:00 ntpd[15965]: Soliciting pool server 185.125.190.57
2026-03-08T22:42:01.476 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:01 ntpd[15965]: Soliciting pool server 185.125.190.56
2026-03-08T22:42:01.476 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:01 ntpd[15965]: Soliciting pool server 172.104.149.161
2026-03-08T22:42:01.476 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:01 ntpd[15965]: Soliciting pool server 172.104.134.72
2026-03-08T22:42:03.517 INFO:teuthology.orchestra.run.vm06.stdout: 8 Mar 22:42:03 ntpd[15965]: ntpd: time slew +0.015780 s
2026-03-08T22:42:03.517 INFO:teuthology.orchestra.run.vm06.stdout:ntpd: time slew +0.015780s
2026-03-08T22:42:03.539 INFO:teuthology.orchestra.run.vm06.stdout:     remote           refid      st t when poll reach   delay   offset   jitter
2026-03-08T22:42:03.539 INFO:teuthology.orchestra.run.vm06.stdout:==============================================================================
2026-03-08T22:42:03.539 INFO:teuthology.orchestra.run.vm06.stdout: 0.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:42:03.539 INFO:teuthology.orchestra.run.vm06.stdout: 1.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:42:03.539 INFO:teuthology.orchestra.run.vm06.stdout: 2.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:42:03.539 INFO:teuthology.orchestra.run.vm06.stdout: 3.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:42:03.539 INFO:teuthology.orchestra.run.vm06.stdout: ntp.ubuntu.com  .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-08T22:42:03.539 INFO:teuthology.run_tasks:Running task install...
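The clock task above runs a single shell pipeline that works whether the node carries ntp, ntpd, or chrony: stop whichever daemon exists, force one step/slew, restart it, then print the peer table. A sketch wrapping that exact command (taken verbatim from the log) with the same 360-second timeout:

```python
# Sketch of the clock-sync fallback chain run by the clock task.
import subprocess

SYNC_CMD = (
    "sudo systemctl stop ntp.service || "
    "sudo systemctl stop ntpd.service || "
    "sudo systemctl stop chronyd.service ; "
    "sudo ntpd -gq || sudo chronyc makestep ; "
    "sudo systemctl start ntp.service || "
    "sudo systemctl start ntpd.service || "
    "sudo systemctl start chronyd.service ; "
    "PATH=/usr/bin:/usr/sbin ntpq -p || "
    "PATH=/usr/bin:/usr/sbin chronyc sources || true"
)

# check=False because the trailing `|| true` already absorbs failures.
subprocess.run(SYNC_CMD, shell=True, timeout=360, check=False)
```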
2026-03-08T22:42:03.541 DEBUG:teuthology.task.install:project ceph
2026-03-08T22:42:03.541 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:42:03.541 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:42:03.541 INFO:teuthology.task.install:Using flavor: default
2026-03-08T22:42:03.543 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-08T22:42:03.543 INFO:teuthology.task.install:extra packages: []
2026-03-08T22:42:03.544 DEBUG:teuthology.orchestra.run.vm06:> sudo apt-key list | grep Ceph
2026-03-08T22:42:03.614 INFO:teuthology.orchestra.run.vm06.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-08T22:42:03.633 INFO:teuthology.orchestra.run.vm06.stdout:uid           [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-08T22:42:03.633 INFO:teuthology.orchestra.run.vm06.stdout:uid           [ unknown] Ceph.com (release key)
2026-03-08T22:42:03.634 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-08T22:42:03.634 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-xmltodict, python3-jmespath on remote deb x86_64
2026-03-08T22:42:03.634 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:42:04.223 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default/
2026-03-08T22:42:04.223 INFO:teuthology.task.install.deb:Package version is 19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:42:04.747 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T22:42:04.747 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-08T22:42:04.754 DEBUG:teuthology.orchestra.run.vm06:> sudo apt-get update
2026-03-08T22:42:05.051 INFO:teuthology.orchestra.run.vm06.stdout:Hit:1 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-08T22:42:05.052 INFO:teuthology.orchestra.run.vm06.stdout:Hit:2 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-08T22:42:05.082 INFO:teuthology.orchestra.run.vm06.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-08T22:42:05.118 INFO:teuthology.orchestra.run.vm06.stdout:Hit:4 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-08T22:42:05.498 INFO:teuthology.orchestra.run.vm06.stdout:Ign:5 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy InRelease
2026-03-08T22:42:05.619 INFO:teuthology.orchestra.run.vm06.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release [7662 B]
2026-03-08T22:42:05.740 INFO:teuthology.orchestra.run.vm06.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-08T22:42:05.860 INFO:teuthology.orchestra.run.vm06.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.1 kB]
2026-03-08T22:42:05.936 INFO:teuthology.orchestra.run.vm06.stdout:Fetched 25.8 kB in 1s (25.1 kB/s)
2026-03-08T22:42:06.605 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists...
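The repo setup above resolves a chacra repository URL for the build sha1 and writes it into an apt source file before updating. A sketch of composing that file; the `deb ... jammy main` line format is an assumption inferred from the `jammy/main amd64 Packages` fetch above, and the URL is the one in the log:

```python
# Sketch: write the chacra repo as an apt source (needs root), then
# `sudo apt-get update` would pick it up as shown in the log.
base = ("https://1.chacra.ceph.com/r/ceph/squid/"
        "e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default")
deb_line = f"deb {base} jammy main\n"

with open("/etc/apt/sources.list.d/ceph.list", "w") as f:
    f.write(deb_line)
```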
2026-03-08T22:42:06.616 DEBUG:teuthology.orchestra.run.vm06:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=19.2.3-678-ge911bdeb-1jammy cephadm=19.2.3-678-ge911bdeb-1jammy ceph-mds=19.2.3-678-ge911bdeb-1jammy ceph-mgr=19.2.3-678-ge911bdeb-1jammy ceph-common=19.2.3-678-ge911bdeb-1jammy ceph-fuse=19.2.3-678-ge911bdeb-1jammy ceph-test=19.2.3-678-ge911bdeb-1jammy ceph-volume=19.2.3-678-ge911bdeb-1jammy radosgw=19.2.3-678-ge911bdeb-1jammy python3-rados=19.2.3-678-ge911bdeb-1jammy python3-rgw=19.2.3-678-ge911bdeb-1jammy python3-cephfs=19.2.3-678-ge911bdeb-1jammy python3-rbd=19.2.3-678-ge911bdeb-1jammy libcephfs2=19.2.3-678-ge911bdeb-1jammy libcephfs-dev=19.2.3-678-ge911bdeb-1jammy librados2=19.2.3-678-ge911bdeb-1jammy librbd1=19.2.3-678-ge911bdeb-1jammy rbd-fuse=19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:42:06.647 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists...
2026-03-08T22:42:06.823 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree...
2026-03-08T22:42:06.824 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information...
2026-03-08T22:42:06.961 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:42:06.961 INFO:teuthology.orchestra.run.vm06.stdout:  kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-08T22:42:06.961 INFO:teuthology.orchestra.run.vm06.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-08T22:42:06.961 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:The following additional packages will be installed:
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  liboath0 libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  libradosstriper1 librdkafka1 libreadline-dev librgw2 libsqlite3-mod-ceph
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  libthrift-0.16.0 lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-pyinotify python3-pytest python3-repoze.lru
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:42:06.962 INFO:teuthology.orchestra.run.vm06.stdout:  python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python3-toml python3-waitress python3-wcwidth python3-webob
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python3-websocket python3-webtest python3-werkzeug python3-zc.lockfile
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  qttranslations5-l10n smartmontools socat unzip xmlstarlet zip
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:Suggested packages:
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python3-influxdb readline-doc python3-beaker python-mako-doc
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python-natsort-doc httpd-wsgi libapache2-mod-python libapache2-mod-scgi
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  libjs-mochikit python-pecan-doc python-psutil-doc subversion
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python-pygments-doc ttf-bitstream-vera python-pyinotify-doc python3-dap
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python-sklearn-doc ipython3 python-waitress-doc python-webob-doc
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  python-webtest-doc python-werkzeug-doc python3-watchdog gsmartcontrol
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  smart-notifier mailx | mailutils
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:Recommended packages:
2026-03-08T22:42:06.963 INFO:teuthology.orchestra.run.vm06.stdout:  btrfs-tools
2026-03-08T22:42:07.001 INFO:teuthology.orchestra.run.vm06.stdout:The following NEW packages will be installed:
2026-03-08T22:42:07.001 INFO:teuthology.orchestra.run.vm06.stdout:  ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-08T22:42:07.001 INFO:teuthology.orchestra.run.vm06.stdout:  ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-08T22:42:07.001 INFO:teuthology.orchestra.run.vm06.stdout:  ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-08T22:42:07.001 INFO:teuthology.orchestra.run.vm06.stdout:  libcephfs-dev libcephfs2 libdouble-conversion3 libfuse2 libjq1 liblttng-ust1
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  liblua5.3-dev libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 libreadline-dev
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  librgw2 libsqlite3-mod-ceph libthrift-0.16.0 lua-any lua-sec lua-socket
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  lua5.1 luarocks nvme-cli pkg-config python-asyncssh-doc
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:42:07.002 INFO:teuthology.orchestra.run.vm06.stdout:  python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-pyinotify python3-pytest python3-rados python3-rbd
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-repoze.lru python3-requests-oauthlib python3-rgw python3-routes
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-rsa python3-simplegeneric python3-simplejson python3-singledispatch
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-sklearn python3-sklearn-lib python3-tempita python3-tempora
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-threadpoolctl python3-toml python3-waitress python3-wcwidth
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  python3-zc.lockfile qttranslations5-l10n radosgw rbd-fuse smartmontools
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  socat unzip xmlstarlet zip
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be upgraded:
2026-03-08T22:42:07.003 INFO:teuthology.orchestra.run.vm06.stdout:  librados2 librbd1
2026-03-08T22:42:07.256 INFO:teuthology.orchestra.run.vm06.stdout:2 upgraded, 107 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:42:07.256 INFO:teuthology.orchestra.run.vm06.stdout:Need to get 178 MB of archives.
2026-03-08T22:42:07.256 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 782 MB of additional disk space will be used.
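Every project package in the install command above is pinned to the exact chacra build version, so apt cannot silently substitute distro builds; that is why only librados2 and librbd1, already present from Ubuntu 22.04, show up as upgrades. A sketch of assembling that command from the package list and version logged earlier:

```python
# Sketch: build the version-pinned apt-get install command shown in the log.
version = "19.2.3-678-ge911bdeb-1jammy"
packages = [
    "ceph", "cephadm", "ceph-mds", "ceph-mgr", "ceph-common", "ceph-fuse",
    "ceph-test", "ceph-volume", "radosgw", "python3-rados", "python3-rgw",
    "python3-cephfs", "python3-rbd", "libcephfs2", "libcephfs-dev",
    "librados2", "librbd1", "rbd-fuse",
]
cmd = (
    "sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes "
    '-o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" '
    "install " + " ".join(f"{p}={version}" for p in packages)
)
print(cmd)
```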
2026-03-08T22:42:07.256 INFO:teuthology.orchestra.run.vm06.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB]
2026-03-08T22:42:07.290 INFO:teuthology.orchestra.run.vm06.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB]
2026-03-08T22:42:07.291 INFO:teuthology.orchestra.run.vm06.stdout:Get:3 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB]
2026-03-08T22:42:07.299 INFO:teuthology.orchestra.run.vm06.stdout:Get:4 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB]
2026-03-08T22:42:07.326 INFO:teuthology.orchestra.run.vm06.stdout:Get:5 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB]
2026-03-08T22:42:07.327 INFO:teuthology.orchestra.run.vm06.stdout:Get:6 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB]
2026-03-08T22:42:07.344 INFO:teuthology.orchestra.run.vm06.stdout:Get:7 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB]
2026-03-08T22:42:07.345 INFO:teuthology.orchestra.run.vm06.stdout:Get:8 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB]
2026-03-08T22:42:07.346 INFO:teuthology.orchestra.run.vm06.stdout:Get:9 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB]
2026-03-08T22:42:07.346 INFO:teuthology.orchestra.run.vm06.stdout:Get:10 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB]
2026-03-08T22:42:07.346 INFO:teuthology.orchestra.run.vm06.stdout:Get:11 https://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB]
2026-03-08T22:42:07.349 INFO:teuthology.orchestra.run.vm06.stdout:Get:12 https://archive.ubuntu.com/ubuntu jammy/main amd64 libreadline-dev amd64 8.1.2-1 [166 kB]
2026-03-08T22:42:07.350 INFO:teuthology.orchestra.run.vm06.stdout:Get:13 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblua5.3-dev amd64 5.3.6-1build1 [167 kB]
2026-03-08T22:42:07.351 INFO:teuthology.orchestra.run.vm06.stdout:Get:14 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua5.1 amd64 5.1.5-8.1build4 [94.6 kB]
2026-03-08T22:42:07.351 INFO:teuthology.orchestra.run.vm06.stdout:Get:15 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-any all 27ubuntu1 [5034 B]
2026-03-08T22:42:07.356 INFO:teuthology.orchestra.run.vm06.stdout:Get:16 https://archive.ubuntu.com/ubuntu jammy/main amd64 zip amd64 3.0-12build2 [176 kB]
2026-03-08T22:42:07.358 INFO:teuthology.orchestra.run.vm06.stdout:Get:17 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 unzip amd64 6.0-26ubuntu3.2 [175 kB]
2026-03-08T22:42:07.359 INFO:teuthology.orchestra.run.vm06.stdout:Get:18 https://archive.ubuntu.com/ubuntu jammy/universe amd64 luarocks all 3.8.0+dfsg1-1 [140 kB]
2026-03-08T22:42:07.361 INFO:teuthology.orchestra.run.vm06.stdout:Get:19 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB]
2026-03-08T22:42:07.361 INFO:teuthology.orchestra.run.vm06.stdout:Get:20 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B]
2026-03-08T22:42:07.364 INFO:teuthology.orchestra.run.vm06.stdout:Get:21 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB]
2026-03-08T22:42:07.364 INFO:teuthology.orchestra.run.vm06.stdout:Get:22 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B]
2026-03-08T22:42:07.365 INFO:teuthology.orchestra.run.vm06.stdout:Get:23 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B]
2026-03-08T22:42:07.365 INFO:teuthology.orchestra.run.vm06.stdout:Get:24 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB]
2026-03-08T22:42:07.365 INFO:teuthology.orchestra.run.vm06.stdout:Get:25 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB]
2026-03-08T22:42:07.371 INFO:teuthology.orchestra.run.vm06.stdout:Get:26 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B]
2026-03-08T22:42:07.371 INFO:teuthology.orchestra.run.vm06.stdout:Get:27 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B]
2026-03-08T22:42:07.371 INFO:teuthology.orchestra.run.vm06.stdout:Get:28 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB]
2026-03-08T22:42:07.373 INFO:teuthology.orchestra.run.vm06.stdout:Get:29 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB]
2026-03-08T22:42:07.374 INFO:teuthology.orchestra.run.vm06.stdout:Get:30 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-logutils all 0.3.3-8 [17.6 kB]
2026-03-08T22:42:07.378 INFO:teuthology.orchestra.run.vm06.stdout:Get:31 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-mako all 1.1.3+ds1-2ubuntu0.1 [60.5 kB]
2026-03-08T22:42:07.379 INFO:teuthology.orchestra.run.vm06.stdout:Get:32 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplegeneric all 0.8.1-3 [11.3 kB]
2026-03-08T22:42:07.379 INFO:teuthology.orchestra.run.vm06.stdout:Get:33 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-singledispatch all 3.4.0.3-3 [7320 B]
2026-03-08T22:42:07.379 INFO:teuthology.orchestra.run.vm06.stdout:Get:34 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB]
2026-03-08T22:42:07.380 INFO:teuthology.orchestra.run.vm06.stdout:Get:35 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-waitress all 1.4.4-1.1ubuntu1.1 [47.0 kB]
2026-03-08T22:42:07.386 INFO:teuthology.orchestra.run.vm06.stdout:Get:36 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempita all 0.5.2-6ubuntu1 [15.1 kB]
2026-03-08T22:42:07.386 INFO:teuthology.orchestra.run.vm06.stdout:Get:37 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-paste all 3.5.0+dfsg1-1 [456 kB]
2026-03-08T22:42:07.390 INFO:teuthology.orchestra.run.vm06.stdout:Get:38 https://archive.ubuntu.com/ubuntu jammy/main amd64 python-pastedeploy-tpl all 2.1.1-1 [4892 B]
2026-03-08T22:42:07.391 INFO:teuthology.orchestra.run.vm06.stdout:Get:39 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastedeploy all 2.1.1-1 [26.6 kB]
2026-03-08T22:42:07.391 INFO:teuthology.orchestra.run.vm06.stdout:Get:40 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-webtest all 2.0.35-1 [28.5 kB]
2026-03-08T22:42:07.393 INFO:teuthology.orchestra.run.vm06.stdout:Get:41 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pecan all 1.3.3-4ubuntu2 [87.3 kB]
2026-03-08T22:42:07.394 INFO:teuthology.orchestra.run.vm06.stdout:Get:42 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-werkzeug all 2.0.2+dfsg1-1ubuntu0.22.04.3 [181 kB]
2026-03-08T22:42:07.396 INFO:teuthology.orchestra.run.vm06.stdout:Get:43 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB]
2026-03-08T22:42:07.397 INFO:teuthology.orchestra.run.vm06.stdout:Get:44 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB]
2026-03-08T22:42:07.398 INFO:teuthology.orchestra.run.vm06.stdout:Get:45 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB]
2026-03-08T22:42:07.401 INFO:teuthology.orchestra.run.vm06.stdout:Get:46 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB]
2026-03-08T22:42:07.402 INFO:teuthology.orchestra.run.vm06.stdout:Get:47 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB]
2026-03-08T22:42:07.444 INFO:teuthology.orchestra.run.vm06.stdout:Get:48 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB]
2026-03-08T22:42:07.444 INFO:teuthology.orchestra.run.vm06.stdout:Get:49 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB]
2026-03-08T22:42:07.445 INFO:teuthology.orchestra.run.vm06.stdout:Get:50 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB]
2026-03-08T22:42:07.453 INFO:teuthology.orchestra.run.vm06.stdout:Get:51 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B]
2026-03-08T22:42:07.453 INFO:teuthology.orchestra.run.vm06.stdout:Get:52 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB]
2026-03-08T22:42:07.454 INFO:teuthology.orchestra.run.vm06.stdout:Get:53 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB]
2026-03-08T22:42:07.454 INFO:teuthology.orchestra.run.vm06.stdout:Get:54 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB]
2026-03-08T22:42:07.454 INFO:teuthology.orchestra.run.vm06.stdout:Get:55 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB]
2026-03-08T22:42:07.454 INFO:teuthology.orchestra.run.vm06.stdout:Get:56 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB]
2026-03-08T22:42:07.456 INFO:teuthology.orchestra.run.vm06.stdout:Get:57 https://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB]
2026-03-08T22:42:07.459 INFO:teuthology.orchestra.run.vm06.stdout:Get:58 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB]
2026-03-08T22:42:07.461 INFO:teuthology.orchestra.run.vm06.stdout:Get:59 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB]
2026-03-08T22:42:07.464 INFO:teuthology.orchestra.run.vm06.stdout:Get:60 https://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB]
2026-03-08T22:42:07.467 INFO:teuthology.orchestra.run.vm06.stdout:Get:61 https://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB]
2026-03-08T22:42:07.470 INFO:teuthology.orchestra.run.vm06.stdout:Get:62 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-socket amd64 3.0~rc1+git+ac3201d-6 [78.9 kB]
2026-03-08T22:42:07.471 INFO:teuthology.orchestra.run.vm06.stdout:Get:63 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-sec amd64 1.0.2-1 [37.6 kB]
2026-03-08T22:42:07.471 INFO:teuthology.orchestra.run.vm06.stdout:Get:64 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB]
2026-03-08T22:42:07.476 INFO:teuthology.orchestra.run.vm06.stdout:Get:65 https://archive.ubuntu.com/ubuntu jammy/main amd64 pkg-config amd64 0.29.2-1ubuntu3 [48.2 kB]
2026-03-08T22:42:07.476 INFO:teuthology.orchestra.run.vm06.stdout:Get:66 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB]
2026-03-08T22:42:07.479 INFO:teuthology.orchestra.run.vm06.stdout:Get:67 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B]
2026-03-08T22:42:07.479 INFO:teuthology.orchestra.run.vm06.stdout:Get:68 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastescript all 2.0.2-4 [54.6 kB]
2026-03-08T22:42:07.479 INFO:teuthology.orchestra.run.vm06.stdout:Get:69 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB]
2026-03-08T22:42:07.480 INFO:teuthology.orchestra.run.vm06.stdout:Get:70 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB]
2026-03-08T22:42:07.486 INFO:teuthology.orchestra.run.vm06.stdout:Get:71 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB]
2026-03-08T22:42:07.487 INFO:teuthology.orchestra.run.vm06.stdout:Get:72 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB]
2026-03-08T22:42:07.494 INFO:teuthology.orchestra.run.vm06.stdout:Get:73 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pyinotify all 0.9.6-1.3 [24.8 kB]
2026-03-08T22:42:07.494 INFO:teuthology.orchestra.run.vm06.stdout:Get:74 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB]
2026-03-08T22:42:07.494 INFO:teuthology.orchestra.run.vm06.stdout:Get:75 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB]
2026-03-08T22:42:07.496 INFO:teuthology.orchestra.run.vm06.stdout:Get:76 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB]
2026-03-08T22:42:07.496 INFO:teuthology.orchestra.run.vm06.stdout:Get:77 https://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB]
2026-03-08T22:42:07.515 INFO:teuthology.orchestra.run.vm06.stdout:Get:78 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB]
2026-03-08T22:42:07.607 INFO:teuthology.orchestra.run.vm06.stdout:Get:79 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 19.2.3-678-ge911bdeb-1jammy [3257 kB]
2026-03-08T22:42:08.432 INFO:teuthology.orchestra.run.vm06.stdout:Get:80 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 19.2.3-678-ge911bdeb-1jammy [3597 kB]
2026-03-08T22:42:08.557 INFO:teuthology.orchestra.run.vm06.stdout:Get:81 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 19.2.3-678-ge911bdeb-1jammy [979 kB]
2026-03-08T22:42:08.570 INFO:teuthology.orchestra.run.vm06.stdout:Get:82 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 19.2.3-678-ge911bdeb-1jammy [357 kB]
2026-03-08T22:42:08.574 INFO:teuthology.orchestra.run.vm06.stdout:Get:83 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 19.2.3-678-ge911bdeb-1jammy [32.9 kB]
2026-03-08T22:42:08.574 INFO:teuthology.orchestra.run.vm06.stdout:Get:84 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 19.2.3-678-ge911bdeb-1jammy [184 kB]
2026-03-08T22:42:08.578 INFO:teuthology.orchestra.run.vm06.stdout:Get:85 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 19.2.3-678-ge911bdeb-1jammy [70.1 kB]
2026-03-08T22:42:08.579 INFO:teuthology.orchestra.run.vm06.stdout:Get:86 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 19.2.3-678-ge911bdeb-1jammy [334 kB]
2026-03-08T22:42:08.585 INFO:teuthology.orchestra.run.vm06.stdout:Get:87 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 19.2.3-678-ge911bdeb-1jammy [6935 kB]
2026-03-08T22:42:08.908 INFO:teuthology.orchestra.run.vm06.stdout:Get:88 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 19.2.3-678-ge911bdeb-1jammy [112 kB]
2026-03-08T22:42:08.911 INFO:teuthology.orchestra.run.vm06.stdout:Get:89 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 19.2.3-678-ge911bdeb-1jammy [470 kB]
2026-03-08T22:42:08.915 INFO:teuthology.orchestra.run.vm06.stdout:Get:90 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 19.2.3-678-ge911bdeb-1jammy [26.5 MB]
2026-03-08T22:42:10.031 INFO:teuthology.orchestra.run.vm06.stdout:Get:91 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 19.2.3-678-ge911bdeb-1jammy [5178 kB]
2026-03-08T22:42:10.255 INFO:teuthology.orchestra.run.vm06.stdout:Get:92 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 19.2.3-678-ge911bdeb-1jammy [248 kB]
2026-03-08T22:42:10.256 INFO:teuthology.orchestra.run.vm06.stdout:Get:93 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 19.2.3-678-ge911bdeb-1jammy [125 kB]
2026-03-08T22:42:10.257 INFO:teuthology.orchestra.run.vm06.stdout:Get:94 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 19.2.3-678-ge911bdeb-1jammy [1081 kB]
2026-03-08T22:42:10.272 INFO:teuthology.orchestra.run.vm06.stdout:Get:95 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 19.2.3-678-ge911bdeb-1jammy [6239 kB]
2026-03-08T22:42:10.510 INFO:teuthology.orchestra.run.vm06.stdout:Get:96 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 19.2.3-678-ge911bdeb-1jammy [23.0 MB]
2026-03-08T22:42:11.419 INFO:teuthology.orchestra.run.vm06.stdout:Get:97 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 19.2.3-678-ge911bdeb-1jammy [14.2 kB]
2026-03-08T22:42:11.419 INFO:teuthology.orchestra.run.vm06.stdout:Get:98 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 19.2.3-678-ge911bdeb-1jammy [1173 kB]
2026-03-08T22:42:11.438 INFO:teuthology.orchestra.run.vm06.stdout:Get:99 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 19.2.3-678-ge911bdeb-1jammy [2503 kB]
2026-03-08T22:42:11.548 INFO:teuthology.orchestra.run.vm06.stdout:Get:100 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 19.2.3-678-ge911bdeb-1jammy [798 kB]
2026-03-08T22:42:11.562 INFO:teuthology.orchestra.run.vm06.stdout:Get:101 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 19.2.3-678-ge911bdeb-1jammy [157 kB]
2026-03-08T22:42:11.567 INFO:teuthology.orchestra.run.vm06.stdout:Get:102 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 19.2.3-678-ge911bdeb-1jammy [2396 kB]
2026-03-08T22:42:11.670 INFO:teuthology.orchestra.run.vm06.stdout:Get:103 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 19.2.3-678-ge911bdeb-1jammy [8625 kB]
2026-03-08T22:42:12.000 INFO:teuthology.orchestra.run.vm06.stdout:Get:104 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 19.2.3-678-ge911bdeb-1jammy [14.3 kB]
2026-03-08T22:42:12.000 INFO:teuthology.orchestra.run.vm06.stdout:Get:105 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 19.2.3-678-ge911bdeb-1jammy [52.1 MB]
2026-03-08T22:42:13.970 INFO:teuthology.orchestra.run.vm06.stdout:Get:106 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 19.2.3-678-ge911bdeb-1jammy [135 kB]
2026-03-08T22:42:13.970 INFO:teuthology.orchestra.run.vm06.stdout:Get:107 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 19.2.3-678-ge911bdeb-1jammy [41.0 kB]
2026-03-08T22:42:13.970 INFO:teuthology.orchestra.run.vm06.stdout:Get:108 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 19.2.3-678-ge911bdeb-1jammy [13.7 MB]
2026-03-08T22:42:14.596 INFO:teuthology.orchestra.run.vm06.stdout:Get:109 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse amd64 19.2.3-678-ge911bdeb-1jammy [92.2 kB]
2026-03-08T22:42:14.914 INFO:teuthology.orchestra.run.vm06.stdout:Fetched 178 MB in 8s (23.5 MB/s) 2026-03-08T22:42:15.204 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package liblttng-ust1:amd64. 2026-03-08T22:42:15.235 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 111717 files and directories currently installed.) 2026-03-08T22:42:15.237 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../000-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ... 2026-03-08T22:42:15.266 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-08T22:42:15.286 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libdouble-conversion3:amd64. 2026-03-08T22:42:15.292 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../001-libdouble-conversion3_3.1.7-4_amd64.deb ... 2026-03-08T22:42:15.292 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-08T22:42:15.307 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libpcre2-16-0:amd64. 2026-03-08T22:42:15.313 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../002-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ... 2026-03-08T22:42:15.314 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-08T22:42:15.335 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libqt5core5a:amd64. 2026-03-08T22:42:15.341 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../003-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:42:15.345 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:42:15.439 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libqt5dbus5:amd64. 2026-03-08T22:42:15.445 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../004-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:42:15.449 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:42:15.484 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libqt5network5:amd64. 2026-03-08T22:42:15.489 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../005-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:42:15.493 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:42:15.535 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libthrift-0.16.0:amd64. 2026-03-08T22:42:15.540 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../006-libthrift-0.16.0_0.16.0-2_amd64.deb ... 2026-03-08T22:42:15.541 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-08T22:42:15.574 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../007-librbd1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:15.581 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking librbd1 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 2026-03-08T22:42:15.678 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../008-librados2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:15.686 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking librados2 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 2026-03-08T22:42:15.782 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libnbd0. 2026-03-08T22:42:15.787 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../009-libnbd0_1.10.5-1_amd64.deb ... 2026-03-08T22:42:15.788 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libnbd0 (1.10.5-1) ... 2026-03-08T22:42:16.042 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libcephfs2. 2026-03-08T22:42:16.048 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../010-libcephfs2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.051 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.092 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-rados. 2026-03-08T22:42:16.098 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../011-python3-rados_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.101 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.136 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-ceph-argparse. 2026-03-08T22:42:16.141 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../012-python3-ceph-argparse_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:16.143 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.171 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-cephfs. 2026-03-08T22:42:16.176 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../013-python3-cephfs_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.180 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.209 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-ceph-common. 2026-03-08T22:42:16.214 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../014-python3-ceph-common_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:16.216 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.244 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-wcwidth. 2026-03-08T22:42:16.249 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../015-python3-wcwidth_0.2.5+dfsg1-1_all.deb ... 2026-03-08T22:42:16.252 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-08T22:42:16.282 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-prettytable. 
2026-03-08T22:42:16.287 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../016-python3-prettytable_2.5.0-2_all.deb ... 2026-03-08T22:42:16.289 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-prettytable (2.5.0-2) ... 2026-03-08T22:42:16.316 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-rbd. 2026-03-08T22:42:16.321 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../017-python3-rbd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.325 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.358 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package librdkafka1:amd64. 2026-03-08T22:42:16.363 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../018-librdkafka1_1.8.0-1build1_amd64.deb ... 2026-03-08T22:42:16.365 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-08T22:42:16.395 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libreadline-dev:amd64. 2026-03-08T22:42:16.401 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../019-libreadline-dev_8.1.2-1_amd64.deb ... 2026-03-08T22:42:16.403 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libreadline-dev:amd64 (8.1.2-1) ... 2026-03-08T22:42:16.431 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package liblua5.3-dev:amd64. 2026-03-08T22:42:16.438 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../020-liblua5.3-dev_5.3.6-1build1_amd64.deb ... 2026-03-08T22:42:16.440 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-08T22:42:16.483 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package lua5.1. 2026-03-08T22:42:16.490 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../021-lua5.1_5.1.5-8.1build4_amd64.deb ... 2026-03-08T22:42:16.491 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking lua5.1 (5.1.5-8.1build4) ... 2026-03-08T22:42:16.518 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package lua-any. 2026-03-08T22:42:16.524 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../022-lua-any_27ubuntu1_all.deb ... 2026-03-08T22:42:16.529 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking lua-any (27ubuntu1) ... 2026-03-08T22:42:16.556 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package zip. 2026-03-08T22:42:16.563 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../023-zip_3.0-12build2_amd64.deb ... 2026-03-08T22:42:16.564 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking zip (3.0-12build2) ... 2026-03-08T22:42:16.596 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package unzip. 2026-03-08T22:42:16.602 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../024-unzip_6.0-26ubuntu3.2_amd64.deb ... 2026-03-08T22:42:16.603 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking unzip (6.0-26ubuntu3.2) ... 2026-03-08T22:42:16.624 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package luarocks. 2026-03-08T22:42:16.629 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../025-luarocks_3.8.0+dfsg1-1_all.deb ... 2026-03-08T22:42:16.630 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking luarocks (3.8.0+dfsg1-1) ... 
2026-03-08T22:42:16.685 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package librgw2. 2026-03-08T22:42:16.691 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../026-librgw2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.692 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.814 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-rgw. 2026-03-08T22:42:16.819 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../027-python3-rgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.820 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.839 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package liboath0:amd64. 2026-03-08T22:42:16.844 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../028-liboath0_2.6.7-3ubuntu0.1_amd64.deb ... 2026-03-08T22:42:16.845 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-08T22:42:16.863 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libradosstriper1. 2026-03-08T22:42:16.869 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../029-libradosstriper1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.869 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:16.894 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-common. 2026-03-08T22:42:16.898 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../030-ceph-common_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:16.899 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:17.371 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-base. 2026-03-08T22:42:17.377 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../031-ceph-base_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:17.382 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:17.498 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-jaraco.functools. 2026-03-08T22:42:17.503 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../032-python3-jaraco.functools_3.4.0-2_all.deb ... 2026-03-08T22:42:17.504 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ... 2026-03-08T22:42:17.518 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-cheroot. 2026-03-08T22:42:17.523 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../033-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ... 2026-03-08T22:42:17.524 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 2026-03-08T22:42:17.593 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-jaraco.classes. 2026-03-08T22:42:17.598 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../034-python3-jaraco.classes_3.2.1-3_all.deb ... 2026-03-08T22:42:17.599 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ... 
2026-03-08T22:42:17.613 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-jaraco.text. 2026-03-08T22:42:17.618 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../035-python3-jaraco.text_3.6.0-2_all.deb ... 2026-03-08T22:42:17.619 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-jaraco.text (3.6.0-2) ... 2026-03-08T22:42:17.635 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-jaraco.collections. 2026-03-08T22:42:17.641 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../036-python3-jaraco.collections_3.4.0-2_all.deb ... 2026-03-08T22:42:17.642 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ... 2026-03-08T22:42:17.659 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-tempora. 2026-03-08T22:42:17.666 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../037-python3-tempora_4.1.2-1_all.deb ... 2026-03-08T22:42:17.667 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-tempora (4.1.2-1) ... 2026-03-08T22:42:17.684 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-portend. 2026-03-08T22:42:17.689 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../038-python3-portend_3.0.0-1_all.deb ... 2026-03-08T22:42:17.690 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-portend (3.0.0-1) ... 2026-03-08T22:42:17.704 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-zc.lockfile. 2026-03-08T22:42:17.710 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../039-python3-zc.lockfile_2.0-1_all.deb ... 2026-03-08T22:42:17.711 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-zc.lockfile (2.0-1) ... 2026-03-08T22:42:17.728 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-cherrypy3. 2026-03-08T22:42:17.734 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../040-python3-cherrypy3_18.6.1-4_all.deb ... 2026-03-08T22:42:17.735 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ... 2026-03-08T22:42:17.765 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-natsort. 2026-03-08T22:42:17.771 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../041-python3-natsort_8.0.2-1_all.deb ... 2026-03-08T22:42:17.772 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-natsort (8.0.2-1) ... 2026-03-08T22:42:17.792 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-logutils. 2026-03-08T22:42:17.800 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../042-python3-logutils_0.3.3-8_all.deb ... 2026-03-08T22:42:17.801 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-logutils (0.3.3-8) ... 2026-03-08T22:42:17.821 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-mako. 2026-03-08T22:42:17.829 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../043-python3-mako_1.1.3+ds1-2ubuntu0.1_all.deb ... 2026-03-08T22:42:17.831 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-08T22:42:17.853 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-simplegeneric. 
2026-03-08T22:42:17.858 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../044-python3-simplegeneric_0.8.1-3_all.deb ... 2026-03-08T22:42:17.859 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-simplegeneric (0.8.1-3) ... 2026-03-08T22:42:17.874 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-singledispatch. 2026-03-08T22:42:17.880 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../045-python3-singledispatch_3.4.0.3-3_all.deb ... 2026-03-08T22:42:17.881 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-singledispatch (3.4.0.3-3) ... 2026-03-08T22:42:17.896 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-webob. 2026-03-08T22:42:17.901 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../046-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ... 2026-03-08T22:42:17.903 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-08T22:42:17.922 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-waitress. 2026-03-08T22:42:17.927 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../047-python3-waitress_1.4.4-1.1ubuntu1.1_all.deb ... 2026-03-08T22:42:17.930 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-08T22:42:18.075 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-tempita. 2026-03-08T22:42:18.081 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../048-python3-tempita_0.5.2-6ubuntu1_all.deb ... 2026-03-08T22:42:18.082 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-tempita (0.5.2-6ubuntu1) ... 2026-03-08T22:42:18.100 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-paste. 2026-03-08T22:42:18.106 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../049-python3-paste_3.5.0+dfsg1-1_all.deb ... 2026-03-08T22:42:18.109 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-paste (3.5.0+dfsg1-1) ... 2026-03-08T22:42:18.146 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python-pastedeploy-tpl. 2026-03-08T22:42:18.152 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../050-python-pastedeploy-tpl_2.1.1-1_all.deb ... 2026-03-08T22:42:18.153 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python-pastedeploy-tpl (2.1.1-1) ... 2026-03-08T22:42:18.168 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-pastedeploy. 2026-03-08T22:42:18.173 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../051-python3-pastedeploy_2.1.1-1_all.deb ... 2026-03-08T22:42:18.174 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-pastedeploy (2.1.1-1) ... 2026-03-08T22:42:18.195 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-webtest. 2026-03-08T22:42:18.202 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../052-python3-webtest_2.0.35-1_all.deb ... 2026-03-08T22:42:18.203 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-webtest (2.0.35-1) ... 2026-03-08T22:42:18.224 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-pecan. 2026-03-08T22:42:18.230 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../053-python3-pecan_1.3.3-4ubuntu2_all.deb ... 
2026-03-08T22:42:18.231 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-pecan (1.3.3-4ubuntu2) ... 2026-03-08T22:42:18.266 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-werkzeug. 2026-03-08T22:42:18.272 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../054-python3-werkzeug_2.0.2+dfsg1-1ubuntu0.22.04.3_all.deb ... 2026-03-08T22:42:18.273 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 2026-03-08T22:42:18.295 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mgr-modules-core. 2026-03-08T22:42:18.300 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../055-ceph-mgr-modules-core_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:18.300 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:18.332 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libsqlite3-mod-ceph. 2026-03-08T22:42:18.337 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../056-libsqlite3-mod-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:18.338 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:18.357 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mgr. 2026-03-08T22:42:18.361 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../057-ceph-mgr_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:18.361 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:18.388 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mon. 2026-03-08T22:42:18.392 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../058-ceph-mon_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:18.394 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:18.499 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libfuse2:amd64. 2026-03-08T22:42:18.505 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../059-libfuse2_2.9.9-5ubuntu3_amd64.deb ... 2026-03-08T22:42:18.506 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ... 2026-03-08T22:42:18.528 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-osd. 2026-03-08T22:42:18.534 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../060-ceph-osd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:18.708 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:19.093 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph. 2026-03-08T22:42:19.098 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../061-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:19.099 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:19.120 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-fuse. 2026-03-08T22:42:19.125 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../062-ceph-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 
2026-03-08T22:42:19.127 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:19.164 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mds. 2026-03-08T22:42:19.169 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../063-ceph-mds_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:19.170 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:19.220 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package cephadm. 2026-03-08T22:42:19.225 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../064-cephadm_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:19.226 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:19.379 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-asyncssh. 2026-03-08T22:42:19.385 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../065-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ... 2026-03-08T22:42:19.386 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-08T22:42:19.415 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mgr-cephadm. 2026-03-08T22:42:19.421 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../066-ceph-mgr-cephadm_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:19.422 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:19.473 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-repoze.lru. 2026-03-08T22:42:19.477 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../067-python3-repoze.lru_0.7-2_all.deb ... 2026-03-08T22:42:19.478 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-repoze.lru (0.7-2) ... 2026-03-08T22:42:19.497 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-routes. 2026-03-08T22:42:19.498 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../068-python3-routes_2.5.1-1ubuntu1_all.deb ... 2026-03-08T22:42:19.499 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ... 2026-03-08T22:42:19.525 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mgr-dashboard. 2026-03-08T22:42:19.532 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../069-ceph-mgr-dashboard_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:19.533 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:19.963 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-sklearn-lib:amd64. 2026-03-08T22:42:19.969 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../070-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ... 2026-03-08T22:42:19.970 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 2026-03-08T22:42:20.037 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-joblib. 2026-03-08T22:42:20.042 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../071-python3-joblib_0.17.0-4ubuntu1_all.deb ... 
2026-03-08T22:42:20.043 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ... 2026-03-08T22:42:20.079 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-threadpoolctl. 2026-03-08T22:42:20.085 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../072-python3-threadpoolctl_3.1.0-1_all.deb ... 2026-03-08T22:42:20.085 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ... 2026-03-08T22:42:20.105 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-sklearn. 2026-03-08T22:42:20.111 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../073-python3-sklearn_0.23.2-5ubuntu6_all.deb ... 2026-03-08T22:42:20.111 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-08T22:42:20.248 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local. 2026-03-08T22:42:20.257 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../074-ceph-mgr-diskprediction-local_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:20.258 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:20.512 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-cachetools. 2026-03-08T22:42:20.515 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../075-python3-cachetools_5.0.0-1_all.deb ... 2026-03-08T22:42:20.516 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-cachetools (5.0.0-1) ... 2026-03-08T22:42:20.532 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-rsa. 2026-03-08T22:42:20.536 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../076-python3-rsa_4.8-1_all.deb ... 2026-03-08T22:42:20.537 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-rsa (4.8-1) ... 2026-03-08T22:42:20.557 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-google-auth. 2026-03-08T22:42:20.562 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../077-python3-google-auth_1.5.1-3_all.deb ... 2026-03-08T22:42:20.583 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-google-auth (1.5.1-3) ... 2026-03-08T22:42:20.601 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-requests-oauthlib. 2026-03-08T22:42:20.606 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../078-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ... 2026-03-08T22:42:20.607 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-08T22:42:20.622 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-websocket. 2026-03-08T22:42:20.626 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../079-python3-websocket_1.2.3-1_all.deb ... 2026-03-08T22:42:20.627 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-websocket (1.2.3-1) ... 2026-03-08T22:42:20.645 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-kubernetes. 2026-03-08T22:42:20.650 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../080-python3-kubernetes_12.0.1-1ubuntu1_all.deb ... 2026-03-08T22:42:20.662 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ... 
2026-03-08T22:42:21.370 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-mgr-k8sevents. 2026-03-08T22:42:21.376 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../081-ceph-mgr-k8sevents_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:21.391 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:21.406 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libonig5:amd64. 2026-03-08T22:42:21.413 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../082-libonig5_6.9.7.1-2build1_amd64.deb ... 2026-03-08T22:42:21.414 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-08T22:42:21.430 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libjq1:amd64. 2026-03-08T22:42:21.435 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../083-libjq1_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-08T22:42:21.437 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-08T22:42:21.450 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package jq. 2026-03-08T22:42:21.454 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../084-jq_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-08T22:42:21.455 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ... 2026-03-08T22:42:21.468 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package socat. 2026-03-08T22:42:21.472 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../085-socat_1.7.4.1-3ubuntu4_amd64.deb ... 2026-03-08T22:42:21.473 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ... 2026-03-08T22:42:21.495 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package xmlstarlet. 2026-03-08T22:42:21.501 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../086-xmlstarlet_1.6.1-2.1_amd64.deb ... 2026-03-08T22:42:21.502 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking xmlstarlet (1.6.1-2.1) ... 2026-03-08T22:42:21.545 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-test. 2026-03-08T22:42:21.550 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../087-ceph-test_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:21.551 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:22.466 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package ceph-volume. 2026-03-08T22:42:22.472 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../088-ceph-volume_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:42:22.472 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:22.508 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package libcephfs-dev. 2026-03-08T22:42:22.516 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../089-libcephfs-dev_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:22.516 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:22.533 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package lua-socket:amd64. 
2026-03-08T22:42:22.539 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../090-lua-socket_3.0~rc1+git+ac3201d-6_amd64.deb ... 2026-03-08T22:42:22.540 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-08T22:42:22.563 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package lua-sec:amd64. 2026-03-08T22:42:22.569 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../091-lua-sec_1.0.2-1_amd64.deb ... 2026-03-08T22:42:22.569 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking lua-sec:amd64 (1.0.2-1) ... 2026-03-08T22:42:22.587 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package nvme-cli. 2026-03-08T22:42:22.593 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../092-nvme-cli_1.16-3ubuntu0.3_amd64.deb ... 2026-03-08T22:42:22.594 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ... 2026-03-08T22:42:22.633 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package pkg-config. 2026-03-08T22:42:22.639 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../093-pkg-config_0.29.2-1ubuntu3_amd64.deb ... 2026-03-08T22:42:22.640 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking pkg-config (0.29.2-1ubuntu3) ... 2026-03-08T22:42:22.655 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python-asyncssh-doc. 2026-03-08T22:42:22.660 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../094-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ... 2026-03-08T22:42:22.662 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-08T22:42:22.706 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-iniconfig. 2026-03-08T22:42:22.713 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../095-python3-iniconfig_1.1.1-2_all.deb ... 2026-03-08T22:42:22.716 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-iniconfig (1.1.1-2) ... 2026-03-08T22:42:22.732 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-pastescript. 2026-03-08T22:42:22.739 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../096-python3-pastescript_2.0.2-4_all.deb ... 2026-03-08T22:42:22.740 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-pastescript (2.0.2-4) ... 2026-03-08T22:42:22.760 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-pluggy. 2026-03-08T22:42:22.767 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../097-python3-pluggy_0.13.0-7.1_all.deb ... 2026-03-08T22:42:22.767 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-pluggy (0.13.0-7.1) ... 2026-03-08T22:42:22.786 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-psutil. 2026-03-08T22:42:22.791 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../098-python3-psutil_5.9.0-1build1_amd64.deb ... 2026-03-08T22:42:22.792 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-psutil (5.9.0-1build1) ... 2026-03-08T22:42:22.815 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-py. 2026-03-08T22:42:22.821 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../099-python3-py_1.10.0-1_all.deb ... 2026-03-08T22:42:22.822 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-py (1.10.0-1) ... 
2026-03-08T22:42:22.851 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-pygments. 2026-03-08T22:42:22.857 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../100-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ... 2026-03-08T22:42:22.859 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ... 2026-03-08T22:42:22.928 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-pyinotify. 2026-03-08T22:42:22.934 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../101-python3-pyinotify_0.9.6-1.3_all.deb ... 2026-03-08T22:42:22.935 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-pyinotify (0.9.6-1.3) ... 2026-03-08T22:42:22.954 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-toml. 2026-03-08T22:42:22.960 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../102-python3-toml_0.10.2-1_all.deb ... 2026-03-08T22:42:22.961 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-toml (0.10.2-1) ... 2026-03-08T22:42:22.980 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-pytest. 2026-03-08T22:42:22.985 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../103-python3-pytest_6.2.5-1ubuntu2_all.deb ... 2026-03-08T22:42:22.987 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ... 2026-03-08T22:42:23.018 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-simplejson. 2026-03-08T22:42:23.024 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../104-python3-simplejson_3.17.6-1build1_amd64.deb ... 2026-03-08T22:42:23.025 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-simplejson (3.17.6-1build1) ... 2026-03-08T22:42:23.052 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package qttranslations5-l10n. 2026-03-08T22:42:23.057 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../105-qttranslations5-l10n_5.15.3-1_all.deb ... 2026-03-08T22:42:23.058 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ... 2026-03-08T22:42:23.171 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package radosgw. 2026-03-08T22:42:23.179 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../106-radosgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:23.179 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:23.405 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package rbd-fuse. 2026-03-08T22:42:23.411 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../107-rbd-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:42:23.412 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:23.445 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package smartmontools. 2026-03-08T22:42:23.451 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../108-smartmontools_7.2-1ubuntu0.1_amd64.deb ... 2026-03-08T22:42:23.460 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ... 2026-03-08T22:42:23.513 INFO:teuthology.orchestra.run.vm06.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ... 
2026-03-08T22:42:23.764 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service. 2026-03-08T22:42:23.764 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service. 2026-03-08T22:42:24.164 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-iniconfig (1.1.1-2) ... 2026-03-08T22:42:24.261 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-08T22:42:24.263 INFO:teuthology.orchestra.run.vm06.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ... 2026-03-08T22:42:24.336 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service. 2026-03-08T22:42:24.592 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service. 2026-03-08T22:42:24.934 INFO:teuthology.orchestra.run.vm06.stdout:nvmf-connect.target is a disabled or a static unit, not starting it. 2026-03-08T22:42:24.941 INFO:teuthology.orchestra.run.vm06.stdout:Could not execute systemctl: at /usr/bin/deb-systemd-invoke line 142. 2026-03-08T22:42:24.947 INFO:teuthology.orchestra.run.vm06.stdout:Setting up cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:25.017 INFO:teuthology.orchestra.run.vm06.stdout:Adding system user cephadm....done 2026-03-08T22:42:25.033 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-08T22:42:25.112 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-jaraco.classes (3.2.1-3) ... 2026-03-08T22:42:25.192 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-08T22:42:25.198 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-jaraco.functools (3.4.0-2) ... 2026-03-08T22:42:25.265 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-repoze.lru (0.7-2) ... 2026-03-08T22:42:25.347 INFO:teuthology.orchestra.run.vm06.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-08T22:42:25.355 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-py (1.10.0-1) ... 2026-03-08T22:42:25.456 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ... 2026-03-08T22:42:25.582 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-cachetools (5.0.0-1) ... 2026-03-08T22:42:25.657 INFO:teuthology.orchestra.run.vm06.stdout:Setting up unzip (6.0-26ubuntu3.2) ... 2026-03-08T22:42:25.666 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-pyinotify (0.9.6-1.3) ... 2026-03-08T22:42:25.744 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-threadpoolctl (3.1.0-1) ... 2026-03-08T22:42:25.822 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:25.904 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 2026-03-08T22:42:25.906 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libnbd0 (1.10.5-1) ... 2026-03-08T22:42:25.909 INFO:teuthology.orchestra.run.vm06.stdout:Setting up lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-08T22:42:25.911 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libreadline-dev:amd64 (8.1.2-1) ... 
2026-03-08T22:42:25.914 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ... 2026-03-08T22:42:25.916 INFO:teuthology.orchestra.run.vm06.stdout:Setting up lua5.1 (5.1.5-8.1build4) ... 2026-03-08T22:42:25.921 INFO:teuthology.orchestra.run.vm06.stdout:update-alternatives: using /usr/bin/lua5.1 to provide /usr/bin/lua (lua-interpreter) in auto mode 2026-03-08T22:42:25.924 INFO:teuthology.orchestra.run.vm06.stdout:update-alternatives: using /usr/bin/luac5.1 to provide /usr/bin/luac (lua-compiler) in auto mode 2026-03-08T22:42:25.926 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-08T22:42:25.928 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-psutil (5.9.0-1build1) ... 2026-03-08T22:42:26.067 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-natsort (8.0.2-1) ... 2026-03-08T22:42:26.145 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ... 2026-03-08T22:42:26.220 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-simplejson (3.17.6-1build1) ... 2026-03-08T22:42:26.309 INFO:teuthology.orchestra.run.vm06.stdout:Setting up zip (3.0-12build2) ... 2026-03-08T22:42:26.311 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ... 2026-03-08T22:42:26.605 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-tempita (0.5.2-6ubuntu1) ... 2026-03-08T22:42:26.683 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python-pastedeploy-tpl (2.1.1-1) ... 2026-03-08T22:42:26.686 INFO:teuthology.orchestra.run.vm06.stdout:Setting up qttranslations5-l10n (5.15.3-1) ... 2026-03-08T22:42:26.688 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-08T22:42:26.784 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-08T22:42:26.927 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-paste (3.5.0+dfsg1-1) ... 2026-03-08T22:42:27.073 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 2026-03-08T22:42:27.170 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 2026-03-08T22:42:27.288 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-jaraco.text (3.6.0-2) ... 2026-03-08T22:42:27.359 INFO:teuthology.orchestra.run.vm06.stdout:Setting up socat (1.7.4.1-3ubuntu4) ... 2026-03-08T22:42:27.362 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:27.456 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-08T22:42:28.022 INFO:teuthology.orchestra.run.vm06.stdout:Setting up pkg-config (0.29.2-1ubuntu3) ... 2026-03-08T22:42:28.045 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:42:28.051 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-toml (0.10.2-1) ... 2026-03-08T22:42:28.124 INFO:teuthology.orchestra.run.vm06.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-08T22:42:28.127 INFO:teuthology.orchestra.run.vm06.stdout:Setting up xmlstarlet (1.6.1-2.1) ... 2026-03-08T22:42:28.130 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-pluggy (0.13.0-7.1) ... 2026-03-08T22:42:28.205 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-zc.lockfile (2.0-1) ... 
2026-03-08T22:42:28.276 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:42:28.278 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-rsa (4.8-1) ... 2026-03-08T22:42:28.349 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-singledispatch (3.4.0.3-3) ... 2026-03-08T22:42:28.418 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-logutils (0.3.3-8) ... 2026-03-08T22:42:28.489 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-tempora (4.1.2-1) ... 2026-03-08T22:42:28.556 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-simplegeneric (0.8.1-3) ... 2026-03-08T22:42:28.628 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-prettytable (2.5.0-2) ... 2026-03-08T22:42:28.703 INFO:teuthology.orchestra.run.vm06.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-08T22:42:28.706 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-websocket (1.2.3-1) ... 2026-03-08T22:42:28.783 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-08T22:42:28.786 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-08T22:42:28.861 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-08T22:42:28.952 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-08T22:42:29.048 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-jaraco.collections (3.4.0-2) ... 2026-03-08T22:42:29.118 INFO:teuthology.orchestra.run.vm06.stdout:Setting up liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-08T22:42:29.121 INFO:teuthology.orchestra.run.vm06.stdout:Setting up lua-sec:amd64 (1.0.2-1) ... 2026-03-08T22:42:29.123 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-08T22:42:29.126 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ... 2026-03-08T22:42:29.265 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-pastedeploy (2.1.1-1) ... 2026-03-08T22:42:29.338 INFO:teuthology.orchestra.run.vm06.stdout:Setting up lua-any (27ubuntu1) ... 2026-03-08T22:42:29.341 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-portend (3.0.0-1) ... 2026-03-08T22:42:29.411 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:42:29.413 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-google-auth (1.5.1-3) ... 2026-03-08T22:42:29.492 INFO:teuthology.orchestra.run.vm06.stdout:Setting up jq (1.6-2.1ubuntu3.1) ... 2026-03-08T22:42:29.495 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-webtest (2.0.35-1) ... 2026-03-08T22:42:29.608 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-cherrypy3 (18.6.1-4) ... 2026-03-08T22:42:29.741 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-pastescript (2.0.2-4) ... 2026-03-08T22:42:29.828 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-pecan (1.3.3-4ubuntu2) ... 2026-03-08T22:42:29.938 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-08T22:42:29.940 INFO:teuthology.orchestra.run.vm06.stdout:Setting up librados2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:29.943 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:42:29.945 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-08T22:42:30.528 INFO:teuthology.orchestra.run.vm06.stdout:Setting up luarocks (3.8.0+dfsg1-1) ... 2026-03-08T22:42:30.536 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.538 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.541 INFO:teuthology.orchestra.run.vm06.stdout:Setting up librbd1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.543 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.546 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.608 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target. 2026-03-08T22:42:30.608 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target. 2026-03-08T22:42:30.968 INFO:teuthology.orchestra.run.vm06.stdout:Setting up libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.976 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.981 INFO:teuthology.orchestra.run.vm06.stdout:Setting up librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.985 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.992 INFO:teuthology.orchestra.run.vm06.stdout:Setting up rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:30.999 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:31.002 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:31.010 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:31.057 INFO:teuthology.orchestra.run.vm06.stdout:Adding group ceph....done 2026-03-08T22:42:31.129 INFO:teuthology.orchestra.run.vm06.stdout:Adding system user ceph....done 2026-03-08T22:42:31.146 INFO:teuthology.orchestra.run.vm06.stdout:Setting system user ceph properties....done 2026-03-08T22:42:31.150 INFO:teuthology.orchestra.run.vm06.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory 2026-03-08T22:42:31.216 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target. 2026-03-08T22:42:31.479 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service. 2026-03-08T22:42:31.844 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:31.847 INFO:teuthology.orchestra.run.vm06.stdout:Setting up radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:32.068 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 
2026-03-08T22:42:32.068 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:42:32.423 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:32.508 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service. 2026-03-08T22:42:32.846 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:32.912 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-08T22:42:32.912 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-08T22:42:33.284 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:33.347 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-08T22:42:33.347 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-08T22:42:33.751 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:33.842 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 2026-03-08T22:42:33.843 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 2026-03-08T22:42:34.260 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:34.263 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:34.277 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:34.340 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 2026-03-08T22:42:34.340 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 2026-03-08T22:42:34.899 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:35.264 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:35.266 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:35.280 INFO:teuthology.orchestra.run.vm06.stdout:Setting up ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:42:35.399 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ... 2026-03-08T22:42:35.458 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T22:42:35.476 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for man-db (2.10.2-1) ... 
2026-03-08T22:42:35.557 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for install-info (6.8-4build1) ... 2026-03-08T22:42:35.922 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:35.922 INFO:teuthology.orchestra.run.vm06.stdout:Running kernel seems to be up-to-date. 2026-03-08T22:42:35.922 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:35.922 INFO:teuthology.orchestra.run.vm06.stdout:Services to be restarted: 2026-03-08T22:42:35.925 INFO:teuthology.orchestra.run.vm06.stdout: systemctl restart packagekit.service 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout:Service restarts being deferred: 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout: systemctl restart unattended-upgrades.service 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout:No containers need to be restarted. 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout:No user sessions are running outdated binaries. 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:35.928 INFO:teuthology.orchestra.run.vm06.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-08T22:42:36.795 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:42:36.798 DEBUG:teuthology.orchestra.run.vm06:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-xmltodict python3-jmespath 2026-03-08T22:42:36.874 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:42:37.058 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:42:37.059 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:42:37.211 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:42:37.211 INFO:teuthology.orchestra.run.vm06.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T22:42:37.211 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-08T22:42:37.211 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:42:37.222 INFO:teuthology.orchestra.run.vm06.stdout:The following NEW packages will be installed: 2026-03-08T22:42:37.222 INFO:teuthology.orchestra.run.vm06.stdout: python3-jmespath python3-xmltodict 2026-03-08T22:42:37.680 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 2 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:42:37.680 INFO:teuthology.orchestra.run.vm06.stdout:Need to get 34.3 kB of archives. 2026-03-08T22:42:37.681 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 146 kB of additional disk space will be used. 
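The install task drives apt fully non-interactively: DEBIAN_FRONTEND=noninteractive, the deprecated --force-yes (apt's stderr warning above points at the --allow-* replacements), and the dpkg conffile policies --force-confdef/--force-confold so no configuration prompt can stall the run. A sketch of the same call using the non-deprecated flags apt suggests:

#!/bin/bash
# Non-interactive package install in the spirit of the teuthology command above,
# with --force-yes replaced by its documented --allow-* successors.
set -euo pipefail
export DEBIAN_FRONTEND=noninteractive
sudo -E apt-get -y \
    --allow-downgrades --allow-remove-essential --allow-change-held-packages \
    -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" \
    install python3-xmltodict python3-jmespath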
2026-03-08T22:42:37.681 INFO:teuthology.orchestra.run.vm06.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB] 2026-03-08T22:42:37.906 INFO:teuthology.orchestra.run.vm06.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB] 2026-03-08T22:42:38.124 INFO:teuthology.orchestra.run.vm06.stdout:Fetched 34.3 kB in 1s (49.5 kB/s) 2026-03-08T22:42:38.306 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-jmespath. 2026-03-08T22:42:38.339 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 118577 files and directories currently installed.) 2026-03-08T22:42:38.342 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ... 2026-03-08T22:42:38.422 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-jmespath (0.10.0-1) ... 2026-03-08T22:42:38.440 INFO:teuthology.orchestra.run.vm06.stdout:Selecting previously unselected package python3-xmltodict. 2026-03-08T22:42:38.446 INFO:teuthology.orchestra.run.vm06.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ... 2026-03-08T22:42:38.446 INFO:teuthology.orchestra.run.vm06.stdout:Unpacking python3-xmltodict (0.12.0-2) ... 2026-03-08T22:42:38.474 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-xmltodict (0.12.0-2) ... 2026-03-08T22:42:38.539 INFO:teuthology.orchestra.run.vm06.stdout:Setting up python3-jmespath (0.10.0-1) ... 2026-03-08T22:42:38.952 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:38.953 INFO:teuthology.orchestra.run.vm06.stdout:Running kernel seems to be up-to-date. 2026-03-08T22:42:38.953 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:38.953 INFO:teuthology.orchestra.run.vm06.stdout:Services to be restarted: 2026-03-08T22:42:38.956 INFO:teuthology.orchestra.run.vm06.stdout: systemctl restart packagekit.service 2026-03-08T22:42:38.959 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:38.965 INFO:teuthology.orchestra.run.vm06.stdout:Service restarts being deferred: 2026-03-08T22:42:38.965 INFO:teuthology.orchestra.run.vm06.stdout: systemctl restart unattended-upgrades.service 2026-03-08T22:42:38.965 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:38.965 INFO:teuthology.orchestra.run.vm06.stdout:No containers need to be restarted. 2026-03-08T22:42:38.965 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:38.965 INFO:teuthology.orchestra.run.vm06.stdout:No user sessions are running outdated binaries. 2026-03-08T22:42:38.966 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T22:42:38.966 INFO:teuthology.orchestra.run.vm06.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-08T22:42:39.827 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
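The "Running kernel seems to be up-to-date" / "Services to be restarted" report printed after each apt transaction comes from needrestart, which scans for services still holding replaced libraries. Assuming the needrestart package on this Ubuntu 22.04 image (as the output indicates), the same check can be re-run by hand:

# Re-run the post-install restart check; -b selects batch (machine-readable) output.
sudo needrestart -b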
2026-03-08T22:42:39.832 DEBUG:teuthology.parallel:result is None 2026-03-08T22:42:39.832 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-08T22:42:40.439 DEBUG:teuthology.orchestra.run.vm06:> dpkg-query -W -f '${Version}' ceph 2026-03-08T22:42:40.448 INFO:teuthology.orchestra.run.vm06.stdout:19.2.3-678-ge911bdeb-1jammy 2026-03-08T22:42:40.449 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678-ge911bdeb-1jammy 2026-03-08T22:42:40.449 INFO:teuthology.task.install:The correct ceph version 19.2.3-678-ge911bdeb-1jammy is installed. 2026-03-08T22:42:40.451 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-08T22:42:40.451 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-08T22:42:40.451 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-08T22:42:40.501 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 2026-03-08T22:42:40.501 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-08T22:42:40.501 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/daemon-helper 2026-03-08T22:42:40.553 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-08T22:42:40.604 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-08T22:42:40.604 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-08T22:42:40.604 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-08T22:42:40.656 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-08T22:42:40.707 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 2026-03-08T22:42:40.708 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-08T22:42:40.708 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/stdin-killer 2026-03-08T22:42:40.756 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-08T22:42:40.804 INFO:teuthology.run_tasks:Running task workunit... 2026-03-08T22:42:40.830 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:42:40.830 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-08T22:42:40.830 INFO:tasks.workunit:timeout=3h 2026-03-08T22:42:40.830 INFO:tasks.workunit:cleanup=True 2026-03-08T22:42:40.830 DEBUG:teuthology.orchestra.run.vm06:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:42:40.851 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-08T22:42:40.851 INFO:teuthology.orchestra.run.vm06.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-08T22:42:40.851 DEBUG:teuthology.orchestra.run.vm06:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:42:40.895 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-08T22:42:40.895 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-08T22:42:40.939 DEBUG:teuthology.orchestra.run.vm06:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:42:40.984 INFO:tasks.workunit.client.0.vm06.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 
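Before the workunit runs, the install task verifies the package version with dpkg-query and ships its helper wrappers (daemon-helper, adjust-ulimits, stdin-killer) by streaming their contents through sudo dd and marking them executable. The workunit task then pins the QA tree to the exact suite commit. A condensed sketch of that clone-and-pin step, with the repo URL and sha1 taken from the log:

# Fetch the QA tree at the precise commit under test, as tasks.workunit does.
rm -rf /home/ubuntu/cephtest/clone.client.0
git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0
cd /home/ubuntu/cephtest/clone.client.0
git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589   # detached HEAD is expected here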
2026-03-08T22:43:39.335 INFO:tasks.workunit.client.0.vm06.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:state without impacting any branches by switching back to a branch. 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: git switch -c <new-branch-name> 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:Or undo this operation with: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: git switch - 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:43:39.336 INFO:tasks.workunit.client.0.vm06.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-08T22:43:39.342 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/standalone && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-08T22:43:39.389 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-08T22:43:39.389 DEBUG:teuthology.orchestra.run.vm06:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-08T22:43:39.435 INFO:tasks.workunit:Running workunits matching osd-backfill on client.0... 2026-03-08T22:43:39.436 INFO:tasks.workunit:Running workunit osd-backfill/osd-backfill-prio.sh...
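The workunit list is built by entering qa/standalone, running make only if a Makefile exists, and collecting every executable file as a NUL-separated list; the task then filters that list against the requested name (osd-backfill). The same discovery step in isolation:

# Enumerate runnable workunits under a QA subdirectory (NUL-separated, as above).
cd /home/ubuntu/cephtest/clone.client.0/qa/standalone
if test -e Makefile ; then make ; fi
find . -executable -type f -printf '%P\0' > /home/ubuntu/cephtest/workunits.list.client.0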
2026-03-08T22:43:39.436 DEBUG:teuthology.orchestra.run.vm06:workunit test osd-backfill/osd-backfill-prio.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh 2026-03-08T22:43:39.483 INFO:tasks.workunit.client.0.vm06.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:43:39.487 INFO:tasks.workunit.client.0.vm06.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:43:39.487 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T22:43:39.487 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-backfill-prio 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:21: run: local dir=td/osd-backfill-prio 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:22: run: shift 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:25: run: export CEPH_MON=127.0.0.1:7114 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:25: run: CEPH_MON=127.0.0.1:7114 2026-03-08T22:43:39.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:26: run: export CEPH_ARGS 2026-03-08T22:43:39.488 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:27: run: uuidgen 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:27: run: CEPH_ARGS+='--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none ' 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:28: run: CEPH_ARGS+='--mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 ' 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:29: run: CEPH_ARGS+='--osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 ' 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:32: run: CEPH_ARGS+='--osd-op-queue=wpq ' 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:33: run: export objects=50 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:33: run: objects=50 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:34: run: export poolprefix=test 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:34: run: poolprefix=test 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:35: run: export FORCE_PRIO=254 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:35: run: FORCE_PRIO=254 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:36: run: export DEGRADED_PRIO=150 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:36: run: DEGRADED_PRIO=150 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:37: run: export NORMAL_PRIO=110 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:37: run: NORMAL_PRIO=110 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:39: run: set 2026-03-08T22:43:39.489 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:39: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:39: run: local 'funcs=TEST_backfill_pool_priority 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:TEST_backfill_priority' 2026-03-08T22:43:39.490 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:40: run: for func in $funcs 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:41: run: setup td/osd-backfill-prio 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-backfill-prio 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-backfill-prio 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-backfill-prio 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:43:39.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-backfill-prio KILL 2026-03-08T22:43:39.491 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:43:39.491 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:43:39.491 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:43:39.491 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:43:39.491 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:43:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:43:39.492 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:43:39.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:43:39.493 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
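The trace above is bash xtrace with PS4 set to '${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ', which is why every line carries the script path, line number, and function name (extra leading slashes mark subshell nesting depth). The script's run function accumulates cluster flags into CEPH_ARGS and then discovers its cases by grepping the output of `set` for functions named TEST_*, wrapping each in setup/teardown. A self-contained skeleton of that driver pattern (the two TEST_ bodies are placeholders):

#!/bin/bash
# Standalone-test driver in the style of ceph-helpers.sh / osd-backfill-prio.sh:
# every function named TEST_* is discovered and run between setup and teardown.
teardown() { rm -fr "$1"; }
setup()    { teardown "$1"; mkdir -p "$1"; trap "teardown $1; exit 1" TERM HUP INT; }
TEST_first()  { echo "first ran in $1"; }
TEST_second() { echo "second ran in $1"; }
dir=td/example
funcs=$(set | sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p')
for func in $funcs; do
    setup "$dir"
    "$func" "$dir" || { echo "$func failed"; exit 1; }
    teardown "$dir"
done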
2026-03-08T22:43:39.494 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:43:39.494 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:43:39.494 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:43:39.495 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:43:39.495 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:43:39.495 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:43:39.495 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:43:39.496 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:43:39.497 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:43:39.497 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-backfill-prio 2026-03-08T22:43:39.497 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:43:39.497 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:39.497 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:43:39.498 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19257 2026-03-08T22:43:39.498 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:43:39.498 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:43:39.498 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-backfill-prio 2026-03-08T22:43:39.499 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:43:39.499 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:39.499 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:43:39.499 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.19257 2026-03-08T22:43:39.500 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-backfill-prio 1' TERM HUP INT 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:42: run: TEST_backfill_pool_priority td/osd-backfill-prio 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:329: TEST_backfill_pool_priority: local dir=td/osd-backfill-prio 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:330: TEST_backfill_pool_priority: local pools=3 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:331: TEST_backfill_pool_priority: local OSDS=2 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:333: TEST_backfill_pool_priority: run_mon td/osd-backfill-prio a 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-backfill-prio 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-backfill-prio/a 2026-03-08T22:43:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-backfill-prio/a --run-dir=td/osd-backfill-prio 2026-03-08T22:43:39.620 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:43:39.620 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:39.620 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n 
'' ']' 2026-03-08T22:43:39.620 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:39.620 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:39.620 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:43:39.621 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:43:39.621 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-backfill-prio/a '--log-file=td/osd-backfill-prio/$name.log' '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --mon-cluster-log-file=td/osd-backfill-prio/log --run-dir=td/osd-backfill-prio '--pid-file=td/osd-backfill-prio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:43:39.660 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:43:39.660 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:43:39.660 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:43:39.660 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:43:39.660 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:43:39.661 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:43:39.661 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:43:39.661 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:43:39.661 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:43:39.661 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:43:39.661 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:39.661 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 
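run_mon first formats the store (ceph-mon --mkfs) and then launches the daemon with test-oriented flags: a 0.1s paxos propose interval, pool deletion and size-one pools allowed, and the PG autoscaler off. The log, pid, and admin-socket paths are passed in single quotes so the literal $cluster/$name metavariables survive the shell and are expanded by the daemon itself. get_config then reads effective values back over the admin socket; the same query in isolation (socket path as in the log):

# Read a live daemon's config over its admin socket, as get_config does.
ceph --format json daemon /tmp/ceph-asok.19257/ceph-mon.a.asok config get fsid | jq -r .fsid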
2026-03-08T22:43:39.662 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-mon.a.asok 2026-03-08T22:43:39.664 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:43:39.664 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19257/ceph-mon.a.asok config get fsid 2026-03-08T22:43:39.731 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:43:39.731 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:43:39.731 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:43:39.731 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:43:39.732 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:43:39.732 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:43:39.732 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:43:39.732 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:43:39.732 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:43:39.732 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:39.732 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:43:39.733 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-mon.a.asok 2026-03-08T22:43:39.740 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:43:39.740 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19257/ceph-mon.a.asok config get mon_host 2026-03-08T22:43:39.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:334: TEST_backfill_pool_priority: run_mgr td/osd-backfill-prio x 2026-03-08T22:43:39.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-backfill-prio 2026-03-08T22:43:39.804 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:43:39.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:43:39.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:43:39.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-backfill-prio/x 2026-03-08T22:43:39.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:43:39.940 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:43:39.940 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:39.941 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:39.941 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:39.941 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:39.941 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:43:39.941 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:43:39.941 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:43:39.943 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-backfill-prio/x '--log-file=td/osd-backfill-prio/$name.log' '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --run-dir=td/osd-backfill-prio '--pid-file=td/osd-backfill-prio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:43:39.965 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:335: TEST_backfill_pool_priority: export CEPH_ARGS 2026-03-08T22:43:39.965 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:337: TEST_backfill_pool_priority: expr 2 - 1 2026-03-08T22:43:39.969 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:337: TEST_backfill_pool_priority: seq 0 1 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:337: TEST_backfill_pool_priority: for osd in 
$(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:339: TEST_backfill_pool_priority: run_osd td/osd-backfill-prio 0 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-prio 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-prio/0 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq ' 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-prio/0' 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-prio/0/journal' 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:39.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-prio' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: 
get_asok_path: get_asok_dir 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-prio/$name.log' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-prio/$name.pid' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-prio/0 2026-03-08T22:43:39.972 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:39.973 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7689003e-18e3-4ce3-a285-ccb085bd7764 2026-03-08T22:43:39.973 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 
7689003e-18e3-4ce3-a285-ccb085bd7764' 2026-03-08T22:43:39.973 INFO:tasks.workunit.client.0.vm06.stdout:add osd0 7689003e-18e3-4ce3-a285-ccb085bd7764 2026-03-08T22:43:39.974 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:39.989 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCb+61pp97XOhAAUMcCZxYlKo0C8QYkdN5vfw== 2026-03-08T22:43:39.989 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCb+61pp97XOhAAUMcCZxYlKo0C8QYkdN5vfw=="}' 2026-03-08T22:43:39.990 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7689003e-18e3-4ce3-a285-ccb085bd7764 -i td/osd-backfill-prio/0/new.json 2026-03-08T22:43:40.128 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:43:40.140 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-prio/0/new.json 2026-03-08T22:43:40.141 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/0 --osd-journal=td/osd-backfill-prio/0/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCb+61pp97XOhAAUMcCZxYlKo0C8QYkdN5vfw== --osd-uuid 7689003e-18e3-4ce3-a285-ccb085bd7764 2026-03-08T22:43:40.160 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:40.153+0000 7f714b5af8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:40.161 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:40.157+0000 7f714b5af8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:40.163 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:40.157+0000 7f714b5af8c0 -1 WARNING: all dangerous and experimental features are enabled. 
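run_osd provisions each OSD from scratch: generate a uuid and a cephx key, register the uuid with `ceph osd new` (which prints the assigned id), then run ceph-osd --mkfs with that key. A condensed sketch with the debug and queue flags trimmed (paths and commands mirror the log):

# Provision a fresh OSD the way run_osd does above.
uuid=$(uuidgen)
mkdir -p td/osd-backfill-prio/0
OSD_SECRET=$(ceph-authtool --gen-print-key)
echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > td/osd-backfill-prio/0/new.json
id=$(ceph osd new "$uuid" -i td/osd-backfill-prio/0/new.json)   # prints the new id (0 here)
rm td/osd-backfill-prio/0/new.json
ceph-osd -i "$id" --mkfs --key "$OSD_SECRET" --osd-uuid "$uuid" \
    --osd-data=td/osd-backfill-prio/0 --osd-journal=td/osd-backfill-prio/0/journal

The "bdev ... open stat got: (1) Operation not permitted" and "_read_fsid unparsable uuid" messages that follow appear during this first mkfs over a plain directory and do not abort the run; the script continues to key registration below.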
2026-03-08T22:43:40.163 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:40.157+0000 7f714b5af8c0 -1 bdev(0x55944a246c00 td/osd-backfill-prio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:40.163 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:40.157+0000 7f714b5af8c0 -1 bluestore(td/osd-backfill-prio/0) _read_fsid unparsable uuid 2026-03-08T22:43:42.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-prio/0/keyring 2026-03-08T22:43:42.490 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:42.491 INFO:tasks.workunit.client.0.vm06.stdout:adding osd0 key to auth repository 2026-03-08T22:43:42.491 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:43:42.491 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-prio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:42.619 INFO:tasks.workunit.client.0.vm06.stdout:start osd.0 2026-03-08T22:43:42.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:43:42.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/0 --osd-journal=td/osd-backfill-prio/0/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:42.624 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:42.624 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:42.624 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:42.674 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:42.661+0000 7f332cf728c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:42.674 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:42.669+0000 7f332cf728c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:42.684 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:42.677+0000 7f332cf728c0 -1 WARNING: all dangerous and experimental features are enabled. 
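The freshly generated key is then added to the auth database, the daemon is started, and the script checks that the cluster-wide noup flag is not set before waiting for the OSD to come up; both checks verbatim from the log:

# Register the OSD key and verify 'noup' is not set (which would keep osd.0 down).
ceph -i td/osd-backfill-prio/0/keyring auth add osd.0 \
    osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"' \
    && echo "noup set: osd.0 will stay down" >&2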
2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:42.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:42.780 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:43:42.903 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:43.649 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:43.645+0000 7f332cf728c0 -1 Falling back to public interface 2026-03-08T22:43:43.904 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:43.904 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:43.904 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:43:43.904 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:43.904 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:43.904 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:43:44.125 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:44.631 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:44.625+0000 7f332cf728c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:43:45.126 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:45.126 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:45.126 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:45.126 
INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:43:45.126 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:45.126 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:43:45.357 INFO:tasks.workunit.client.0.vm06.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2944527415,v1:127.0.0.1:6803/2944527415] [v2:127.0.0.1:6804/2944527415,v1:127.0.0.1:6805/2944527415] exists,up 7689003e-18e3-4ce3-a285-ccb085bd7764 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:337: TEST_backfill_pool_priority: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:339: TEST_backfill_pool_priority: run_osd td/osd-backfill-prio 1 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-prio 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-prio/1 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq ' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-prio/1' 
2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-prio/1/journal' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-prio' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-prio/$name.log' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-prio/$name.pid' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:45.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:45.359 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:45.359 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:45.359 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:45.359 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:45.359 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-prio/1 2026-03-08T22:43:45.359 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:45.360 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=09a26b5f-ba58-4691-86d1-367c53d472a1 2026-03-08T22:43:45.360 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 09a26b5f-ba58-4691-86d1-367c53d472a1' 2026-03-08T22:43:45.360 INFO:tasks.workunit.client.0.vm06.stdout:add osd1 09a26b5f-ba58-4691-86d1-367c53d472a1 2026-03-08T22:43:45.360 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:45.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCh+61paCY6FhAAVXsaSbRuRsZ80oQkkA53Yg== 2026-03-08T22:43:45.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCh+61paCY6FhAAVXsaSbRuRsZ80oQkkA53Yg=="}' 2026-03-08T22:43:45.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 09a26b5f-ba58-4691-86d1-367c53d472a1 -i td/osd-backfill-prio/1/new.json 2026-03-08T22:43:45.609 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:43:45.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-prio/1/new.json 2026-03-08T22:43:45.624 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/1 --osd-journal=td/osd-backfill-prio/1/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 
--mkfs --key AQCh+61paCY6FhAAVXsaSbRuRsZ80oQkkA53Yg== --osd-uuid 09a26b5f-ba58-4691-86d1-367c53d472a1 2026-03-08T22:43:45.643 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:45.637+0000 7fcbe0c188c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:45.645 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:45.641+0000 7fcbe0c188c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:45.646 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:45.641+0000 7fcbe0c188c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:45.647 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:45.641+0000 7fcbe0c188c0 -1 bdev(0x558ba2c6bc00 td/osd-backfill-prio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:45.647 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:45.641+0000 7fcbe0c188c0 -1 bluestore(td/osd-backfill-prio/1) _read_fsid unparsable uuid 2026-03-08T22:43:48.194 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-prio/1/keyring 2026-03-08T22:43:48.194 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:48.195 INFO:tasks.workunit.client.0.vm06.stdout:adding osd1 key to auth repository 2026-03-08T22:43:48.195 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:43:48.195 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-prio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:48.502 INFO:tasks.workunit.client.0.vm06.stdout:start osd.1 2026-03-08T22:43:48.502 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:43:48.502 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/1 --osd-journal=td/osd-backfill-prio/1/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:48.502 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:48.502 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:48.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 
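
Note: ceph-helpers.sh:633-684, traced above for osd.1, performs the whole OSD life-cycle by hand: build up a flag string, provision the OSD id and key, mkfs, register the key with the mon, start the daemon, and skip the up-wait only if the cluster-wide noup flag is set. The bdev "open stat got: (1) Operation not permitted" and bluestore "_read_fsid unparsable uuid" complaints during --mkfs appear to be benign first-format noise here, since the mkfs completes and the daemon starts. Stripped of the per-flag appends, the skeleton visible in the trace is roughly:

    # skeleton of run_osd as traced (ceph-helpers.sh:662-684); $ceph_args stands in
    # for the long flag string assembled above, error handling elided
    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > $osd_data/new.json
    ceph osd new $uuid -i $osd_data/new.json    # allocates the numeric id (the bare "1" in stdout)
    rm $osd_data/new.json
    ceph-osd -i $id $ceph_args --mkfs --key $OSD_SECRET --osd-uuid $uuid
    # keyring file written from the generated secret (the bare "cat" at :671)
    ceph -i $osd_data/keyring auth add osd.$id osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
    ceph-osd -i $id $ceph_args                  # the actual daemon start
    if ! ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'; then
        wait_for_osd up $id || return 1
    fi
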
2026-03-08T22:43:48.520 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:48.513+0000 7fadcdb048c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:48.521 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:48.517+0000 7fadcdb048c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:48.523 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:48.517+0000 7fadcdb048c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:48.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:43:48.983 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:49.245 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:49.241+0000 7fadcdb048c0 -1 Falling back to public interface 2026-03-08T22:43:49.984 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:43:49.985 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:49.985 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:49.985 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:49.985 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:49.985 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:43:50.216 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:50.231 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:43:50.225+0000 7fadcdb048c0 -1 osd.1 0 log_to_monitors true 
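
Note: the bare 0/1/2 stdout lines and the repeated "ceph osd dump | grep 'osd.1 up'" records above are wait_for_osd (ceph-helpers.sh:977-991) polling once per second, up to 300 tries, for the literal string "osd.<id> up" in the osdmap dump. Paraphrased from the xtrace:

    # paraphrase of wait_for_osd as traced (ceph-helpers.sh:977-991)
    wait_for_osd() {
        local state=$1   # "up" here
        local id=$2
        local status=1
        for ((i=0; i < 300; i++)); do
            echo $i                                   # the bare counter lines in stdout
            if ceph osd dump | grep "osd.$id $state"; then
                status=0
                break
            fi
            sleep 1
        done
        return $status
    }
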
2026-03-08T22:43:51.218 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:43:51.218 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:51.218 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:51.218 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:51.218 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:51.218 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:43:51.450 INFO:tasks.workunit.client.0.vm06.stdout:osd.1 up in weight 1 up_from 9 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2030384809,v1:127.0.0.1:6811/2030384809] [v2:127.0.0.1:6812/2030384809,v1:127.0.0.1:6813/2030384809] exists,up 09a26b5f-ba58-4691-86d1-367c53d472a1 2026-03-08T22:43:51.450 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:51.450 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:51.450 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:51.450 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:342: TEST_backfill_pool_priority: seq 1 3 2026-03-08T22:43:51.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:342: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:43:51.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:344: TEST_backfill_pool_priority: create_pool test1 1 1 2026-03-08T22:43:51.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test1 1 1 2026-03-08T22:43:51.692 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test1' already exists 2026-03-08T22:43:51.704 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:43:52.705 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:345: TEST_backfill_pool_priority: ceph osd pool set test1 size 2 2026-03-08T22:43:53.041 INFO:tasks.workunit.client.0.vm06.stderr:set pool 1 size to 2 2026-03-08T22:43:53.058 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:342: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:43:53.058 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:344: TEST_backfill_pool_priority: create_pool test2 1 1 2026-03-08T22:43:53.058 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test2 1 1 2026-03-08T22:43:53.346 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test2' already exists 2026-03-08T22:43:53.359 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:43:54.360 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:345: TEST_backfill_pool_priority: ceph osd pool set test2 size 2 2026-03-08T22:43:54.908 INFO:tasks.workunit.client.0.vm06.stderr:set pool 2 size to 2 2026-03-08T22:43:54.922 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:342: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:43:54.923 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:344: TEST_backfill_pool_priority: create_pool test3 1 1 2026-03-08T22:43:54.923 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test3 1 1 2026-03-08T22:43:55.171 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test3' already exists 2026-03-08T22:43:55.185 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:43:56.186 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:345: TEST_backfill_pool_priority: ceph osd pool set test3 size 2 2026-03-08T22:43:56.556 INFO:tasks.workunit.client.0.vm06.stderr:set pool 3 size to 2 2026-03-08T22:43:56.574 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:347: TEST_backfill_pool_priority: sleep 5 2026-03-08T22:44:01.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:349: TEST_backfill_pool_priority: wait_for_clean 2026-03-08T22:44:01.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:44:01.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:44:01.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:44:01.575 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:44:01.576 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:44:01.576 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:44:01.576 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:44:01.576 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:44:01.576 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:44:01.637 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:44:01.637 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:44:01.637 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:44:01.637 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:44:01.637 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:44:01.637 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:44:01.861 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:44:01.861 INFO:tasks.workunit.client.0.vm06.stderr:1' 2026-03-08T22:44:01.862 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:44:01.862 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:01.862 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:44:01.955 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T22:44:01.955 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T22:44:01.955 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T22:44:01.955 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:01.955 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:44:02.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705669 2026-03-08T22:44:02.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705669 2026-03-08T22:44:02.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 
1-38654705669' 2026-03-08T22:44:02.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:02.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T22:44:02.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:02.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:44:02.046 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T22:44:02.047 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:02.048 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836486 2026-03-08T22:44:02.048 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T22:44:02.048 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T22:44:02.048 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:02.281 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836486 2026-03-08T22:44:02.281 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:44:03.282 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:44:03.282 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:03.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836486 2026-03-08T22:44:03.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:03.519 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705669 2026-03-08T22:44:03.519 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:03.520 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:44:03.520 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705669 2026-03-08T22:44:03.520 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
cut -d - -f 2 2026-03-08T22:44:03.521 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705669 2026-03-08T22:44:03.521 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705669 2026-03-08T22:44:03.521 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705669' 2026-03-08T22:44:03.521 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:03.758 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705669 -lt 38654705669 2026-03-08T22:44:03.758 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:44:03.758 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:03.759 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 3 == 0 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:04.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:04.291 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:44:04.291 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:04.291 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:04.291 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:44:04.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 3
2026-03-08T22:44:04.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:44:04.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:44:04.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:351: TEST_backfill_pool_priority: ceph pg dump pgs
2026-03-08T22:44:04.798 INFO:tasks.workunit.client.0.vm06.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T22:44:04.798 INFO:tasks.workunit.client.0.vm06.stdout:3.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:43:56.652219+0000 0'0 27:20 [1,0] 1 [1,0] 1 0'0 2026-03-08T22:43:55.112063+0000 0'0 2026-03-08T22:43:55.112063+0000 0 0 periodic scrub scheduled @ 2026-03-10T06:09:06.377090+0000 0 0
2026-03-08T22:44:04.798 INFO:tasks.workunit.client.0.vm06.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:43:55.183957+0000 0'0 27:34 [0,1] 0 [0,1] 0 0'0 2026-03-08T22:43:53.274928+0000 0'0 2026-03-08T22:43:53.274928+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:17:12.840999+0000 0 0
2026-03-08T22:44:04.798 INFO:tasks.workunit.client.0.vm06.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:43:53.306119+0000 0'0 27:40 [1,0] 1 [1,0] 1 0'0 2026-03-08T22:43:51.636847+0000 0'0 2026-03-08T22:43:51.636847+0000 0 0 periodic scrub scheduled @ 2026-03-10T10:33:23.208532+0000 0 0
2026-03-08T22:44:04.798 INFO:tasks.workunit.client.0.vm06.stdout:
2026-03-08T22:44:04.798 INFO:tasks.workunit.client.0.vm06.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
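
Note: wait_for_clean, which just returned 0 above, stacks two mechanisms. First flush_pg_stats makes each OSD publish its PG stats and blocks until the mon's last-stat-seq for that OSD catches up with the sequence number returned by the tell command; then a backoff loop compares the number of active+clean, non-stale PGs against the total PG count. The delays array printed earlier comes from get_timeout_delays 90 .1: doubling steps 0.1, 0.2, ... 12.8, four 15-second steps, and a 4.5 tail, which sum to exactly the requested 90 seconds. Paraphrased from the trace (timeout bookkeeping elided):

    # paraphrase of flush_pg_stats as traced (ceph-helpers.sh:2260-2279)
    for osd in $(ceph osd ls); do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        # wait until the mon has ingested stats at least as new as $seq
        while test $(ceph osd last-stat-seq $osd) -lt $seq; do
            sleep 1
        done
    done

    # paraphrase of get_num_active_clean as traced (ceph-helpers.sh:1364-1368)
    ceph --format json pg dump pgs |
        jq '.pg_stats | [.[] | .state |
            select(contains("active") and contains("clean")) |
            select(contains("stale") | not)] | length'
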
2026-03-08T22:44:04.798 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:355: TEST_backfill_pool_priority: local PG1 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:356: TEST_backfill_pool_priority: local POOLNUM1 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:357: TEST_backfill_pool_priority: local pool1 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:358: TEST_backfill_pool_priority: local chk_osd1_1 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:359: TEST_backfill_pool_priority: local chk_osd1_2 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:361: TEST_backfill_pool_priority: local PG2 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:362: TEST_backfill_pool_priority: local POOLNUM2 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:363: TEST_backfill_pool_priority: local pool2 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:364: TEST_backfill_pool_priority: local chk_osd2_1 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:365: TEST_backfill_pool_priority: local chk_osd2_2 2026-03-08T22:44:04.811 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:367: TEST_backfill_pool_priority: seq 1 3 2026-03-08T22:44:04.812 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:367: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:44:04.812 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:369: TEST_backfill_pool_priority: ceph pg map 1.0 --format=json 2026-03-08T22:44:04.812 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:369: TEST_backfill_pool_priority: jq '.acting[]' 2026-03-08T22:44:05.044 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:370: TEST_backfill_pool_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:44:05.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:370: TEST_backfill_pool_priority: local test_osd1=1 2026-03-08T22:44:05.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:371: TEST_backfill_pool_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:44:05.046 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:371: TEST_backfill_pool_priority: local test_osd2=0 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:372: TEST_backfill_pool_priority: '[' -z '' ']' 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:374: TEST_backfill_pool_priority: PG1=1.0 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:375: TEST_backfill_pool_priority: POOLNUM1=1 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:376: TEST_backfill_pool_priority: pool1=test1 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:377: TEST_backfill_pool_priority: chk_osd1_1=1 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:378: TEST_backfill_pool_priority: chk_osd1_2=0 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:367: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:44:05.046 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:369: TEST_backfill_pool_priority: ceph pg map 2.0 --format=json 2026-03-08T22:44:05.047 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:369: TEST_backfill_pool_priority: jq '.acting[]' 2026-03-08T22:44:05.274 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:370: TEST_backfill_pool_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:44:05.275 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:370: TEST_backfill_pool_priority: local test_osd1=0 2026-03-08T22:44:05.275 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:371: TEST_backfill_pool_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:371: TEST_backfill_pool_priority: local test_osd2=1 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:372: TEST_backfill_pool_priority: '[' -z 1.0 ']' 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:379: TEST_backfill_pool_priority: '[' 1 '!=' 0 ']' 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:381: TEST_backfill_pool_priority: PG2=2.0 2026-03-08T22:44:05.276 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:382: TEST_backfill_pool_priority: POOLNUM2=2 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:383: TEST_backfill_pool_priority: pool2=test2 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:384: TEST_backfill_pool_priority: chk_osd2_1=0 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:385: TEST_backfill_pool_priority: chk_osd2_2=1 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:386: TEST_backfill_pool_priority: break 2026-03-08T22:44:05.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:389: TEST_backfill_pool_priority: rm -f td/osd-backfill-prio/acting 2026-03-08T22:44:05.277 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:391: TEST_backfill_pool_priority: '[' test2 = '' ']' 2026-03-08T22:44:05.277 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:397: TEST_backfill_pool_priority: seq 1 3 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:397: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:399: TEST_backfill_pool_priority: '[' 1 '!=' 1 -a 1 '!=' 2 ']' 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:397: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:399: TEST_backfill_pool_priority: '[' 2 '!=' 1 -a 2 '!=' 2 ']' 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:397: TEST_backfill_pool_priority: for p in $(seq 1 $pools) 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:399: TEST_backfill_pool_priority: '[' 3 '!=' 1 -a 3 '!=' 2 ']' 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:401: TEST_backfill_pool_priority: delete_pool test3 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test3 2026-03-08T22:44:05.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test3 test3 --yes-i-really-really-mean-it 2026-03-08T22:44:05.561 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test3' does not exist 2026-03-08T22:44:05.575 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:405: TEST_backfill_pool_priority: pool1_extra_prio=1 2026-03-08T22:44:05.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:406: TEST_backfill_pool_priority: pool2_extra_prio=2 2026-03-08T22:44:05.575 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:408: TEST_backfill_pool_priority: expr 150 + 1 + 1 2026-03-08T22:44:05.576 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:408: TEST_backfill_pool_priority: pool1_prio=152 2026-03-08T22:44:05.577 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:409: TEST_backfill_pool_priority: expr 150 + 1 + 2 2026-03-08T22:44:05.577 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:409: TEST_backfill_pool_priority: pool2_prio=153 2026-03-08T22:44:05.577 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:411: TEST_backfill_pool_priority: ceph osd pool set test1 size 1 --yes-i-really-mean-it 2026-03-08T22:44:05.895 INFO:tasks.workunit.client.0.vm06.stderr:set pool 1 size to 1 2026-03-08T22:44:05.912 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:412: TEST_backfill_pool_priority: ceph osd pool set test1 recovery_priority 1 2026-03-08T22:44:06.206 INFO:tasks.workunit.client.0.vm06.stderr:set pool 1 recovery_priority to 1 2026-03-08T22:44:06.222 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:413: TEST_backfill_pool_priority: ceph osd pool set test2 size 1 --yes-i-really-mean-it 2026-03-08T22:44:06.517 INFO:tasks.workunit.client.0.vm06.stderr:set pool 2 size to 1 2026-03-08T22:44:06.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:414: TEST_backfill_pool_priority: ceph osd pool set test2 recovery_priority 2 2026-03-08T22:44:06.828 INFO:tasks.workunit.client.0.vm06.stderr:set pool 2 recovery_priority to 2 2026-03-08T22:44:06.844 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:415: TEST_backfill_pool_priority: wait_for_clean 2026-03-08T22:44:06.844 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:44:06.844 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:44:06.844 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:44:06.844 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:44:06.845 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt 
-q -o xtrace 2026-03-08T22:44:06.845 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:44:06.845 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:44:06.845 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:44:06.845 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:44:06.908 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:44:06.908 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:44:06.908 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:44:06.908 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:44:06.908 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:44:06.908 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:44:07.150 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:44:07.150 INFO:tasks.workunit.client.0.vm06.stderr:1' 2026-03-08T22:44:07.150 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:44:07.150 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:44:07.246 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836489 2026-03-08T22:44:07.247 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836489 2026-03-08T22:44:07.247 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489' 2026-03-08T22:44:07.247 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.247 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:44:07.334 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: seq=38654705672 2026-03-08T22:44:07.334 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705672 2026-03-08T22:44:07.334 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489 1-38654705672' 2026-03-08T22:44:07.334 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:07.334 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836489 2026-03-08T22:44:07.335 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:07.336 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:44:07.336 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836489 2026-03-08T22:44:07.336 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:07.337 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836489 2026-03-08T22:44:07.337 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836489 2026-03-08T22:44:07.337 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836489' 2026-03-08T22:44:07.338 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:07.570 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836489 2026-03-08T22:44:07.570 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:44:08.571 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:44:08.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:08.793 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836489 2026-03-08T22:44:08.794 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:44:09.795 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:44:09.795 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:10.019 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836489 2026-03-08T22:44:10.019 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:10.019 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705672 2026-03-08T22:44:10.019 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:10.020 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:44:10.021 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705672 2026-03-08T22:44:10.021 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:10.022 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705672 2026-03-08T22:44:10.022 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705672 2026-03-08T22:44:10.022 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705672' 2026-03-08T22:44:10.022 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:10.245 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705672 -lt 38654705672 2026-03-08T22:44:10.245 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:44:10.245 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:10.245 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:10.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 2 == 0 2026-03-08T22:44:10.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:10.539 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:10.539 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:10.539 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:10.539 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:10.540 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:10.540 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:10.765 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:44:10.766 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:10.766 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:10.766 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:11.064 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 2 2026-03-08T22:44:11.064 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:44:11.064 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:44:11.064 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:417: TEST_backfill_pool_priority: dd if=/dev/urandom of=td/osd-backfill-prio/data bs=1M count=10 2026-03-08T22:44:11.101 INFO:tasks.workunit.client.0.vm06.stderr:10+0 records in 2026-03-08T22:44:11.101 INFO:tasks.workunit.client.0.vm06.stderr:10+0 records out 2026-03-08T22:44:11.101 INFO:tasks.workunit.client.0.vm06.stderr:10485760 bytes (10 MB, 10 MiB) copied, 0.0362235 s, 289 MB/s 2026-03-08T22:44:11.101 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:418: TEST_backfill_pool_priority: p=1 2026-03-08T22:44:11.101 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:419: TEST_backfill_pool_priority: for pname in $pool1 $pool2 2026-03-08T22:44:11.102 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: seq 1 50 2026-03-08T22:44:11.103 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.103 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj1-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.171 
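wait_for_clean, which returned 0 just above, reduces to two probes that appear verbatim in the trace: get_num_pgs reads .pgmap.num_pgs out of ceph --format json status, and get_num_active_clean counts PG states that contain both "active" and "clean" but not "stale". Standalone, assuming only a reachable cluster and jq:

    num_pgs=$(ceph --format json status | jq .pgmap.num_pgs)
    active_clean=$(ceph --format json pg dump pgs |
        jq '.pg_stats | [.[] | .state
             | select(contains("active") and contains("clean"))
             | select(contains("stale") | not)] | length')
    test "$active_clean" = "$num_pgs"    # break condition: 2 = 2 in this run

With the cluster clean, the test starts writing its workload: a 10 MiB random file pushed into each pool 50 times, traced at length below.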
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.171 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj2-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.230 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.230 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj3-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.292 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.292 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj4-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.355 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.355 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj5-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.418 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.418 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj6-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.477 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.477 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj7-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj8-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.594 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.594 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj9-p1 
td/osd-backfill-prio/data 2026-03-08T22:44:11.659 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.659 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj10-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.728 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.728 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj11-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.787 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.787 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj12-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.854 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.854 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj13-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.913 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.913 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj14-p1 td/osd-backfill-prio/data 2026-03-08T22:44:11.974 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:11.974 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj15-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.035 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.035 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj16-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.097 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.097 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: 
TEST_backfill_pool_priority: rados -p test1 put obj17-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj18-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.220 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.220 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj19-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.274 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.274 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj20-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.336 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.336 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj21-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj22-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.450 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.450 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj23-p1 td/osd-backfill-prio/data 2026-03-08T22:44:12.512 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:12.513 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj24-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.015 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj25-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.188 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.188 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj26-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.458 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.458 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj27-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.528 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.528 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj28-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj29-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.645 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.646 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj30-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.708 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.708 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj31-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj32-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.829 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in 
$(seq 1 $objects) 2026-03-08T22:44:13.829 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj33-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.890 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.890 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj34-p1 td/osd-backfill-prio/data 2026-03-08T22:44:13.945 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:13.945 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj35-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.002 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.002 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj36-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.059 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.059 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj37-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.119 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.119 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj38-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.174 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.174 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj39-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.233 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.233 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj40-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.291 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: 
TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.291 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj41-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.349 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.350 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj42-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.407 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.407 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj43-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.464 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.464 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj44-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj45-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj46-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.630 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.630 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj47-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.684 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.684 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj48-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.738 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.738 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj49-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.791 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.791 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test1 put obj50-p1 td/osd-backfill-prio/data 2026-03-08T22:44:14.847 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:425: TEST_backfill_pool_priority: expr 1 + 1 2026-03-08T22:44:14.847 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:425: TEST_backfill_pool_priority: p=2 2026-03-08T22:44:14.847 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:419: TEST_backfill_pool_priority: for pname in $pool1 $pool2 2026-03-08T22:44:14.848 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: seq 1 50 2026-03-08T22:44:14.848 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.848 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj1-p2 td/osd-backfill-prio/data 2026-03-08T22:44:14.905 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.905 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj2-p2 td/osd-backfill-prio/data 2026-03-08T22:44:14.961 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:14.961 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj3-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.019 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.020 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj4-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.075 
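The long run of puts above and below comes from one small loop in TEST_backfill_pool_priority (script lines 417 to 425 in the trace): dd fills td/osd-backfill-prio/data with 10 MiB of /dev/urandom, then each of the two pools receives that file 50 times under names obj<i>-p<p>. Reconstructed from the xtrace; pool1, pool2, objects, and dir are the test's own variables, defined outside this excerpt:

    dd if=/dev/urandom of=$dir/data bs=1M count=10
    p=1
    for pname in $pool1 $pool2; do         # test1, then test2
        for i in $(seq 1 $objects); do     # objects=50, per the 'seq 1 50' in the trace
            rados -p $pname put obj$i-p$p $dir/data
        done
        p=$(expr $p + 1)                   # matches the 'expr 1 + 1' / 'expr 2 + 1' lines
    done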
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.075 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj5-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.133 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.133 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj6-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.189 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.189 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj7-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.247 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.248 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj8-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.306 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.306 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj9-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.362 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.362 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj10-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.420 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.420 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj11-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj12-p2 
td/osd-backfill-prio/data 2026-03-08T22:44:15.535 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.535 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj13-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj14-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.647 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.647 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj15-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.711 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.711 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj16-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.767 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.767 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj17-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.821 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.821 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj18-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.881 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.881 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj19-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: 
TEST_backfill_pool_priority: rados -p test2 put obj20-p2 td/osd-backfill-prio/data 2026-03-08T22:44:15.992 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:15.993 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj21-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.051 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.051 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj22-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.108 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.108 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj23-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.166 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.166 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj24-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.222 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.222 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj25-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.282 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.282 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj26-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.338 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.338 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj27-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.397 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj28-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj29-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.508 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.508 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj30-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.564 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.564 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj31-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.621 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.621 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj32-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.678 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.678 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj33-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj34-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.791 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.791 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj35-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.846 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in 
$(seq 1 $objects) 2026-03-08T22:44:16.846 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj36-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj37-p2 td/osd-backfill-prio/data 2026-03-08T22:44:16.962 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:16.962 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj38-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.020 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.021 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj39-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.080 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.080 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj40-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.134 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.134 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj41-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.196 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.196 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj42-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.248 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.248 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj43-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.305 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: 
TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.305 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj44-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj45-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj46-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj47-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.522 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.522 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj48-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.576 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.576 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj49-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.634 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:421: TEST_backfill_pool_priority: for i in $(seq 1 $objects) 2026-03-08T22:44:17.634 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:423: TEST_backfill_pool_priority: rados -p test2 put obj50-p2 td/osd-backfill-prio/data 2026-03-08T22:44:17.691 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:425: TEST_backfill_pool_priority: expr 2 + 1 2026-03-08T22:44:17.692 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:425: TEST_backfill_pool_priority: p=3 2026-03-08T22:44:17.692 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:428: 
TEST_backfill_pool_priority: get_not_primary test1 obj1-p1 2026-03-08T22:44:17.692 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=test1 2026-03-08T22:44:17.692 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1-p1 2026-03-08T22:44:17.692 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary test1 obj1-p1 2026-03-08T22:44:17.692 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test1 2026-03-08T22:44:17.692 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1-p1 2026-03-08T22:44:17.693 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test1 obj1-p1 2026-03-08T22:44:17.693 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:44:17.921 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:44:17.921 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map test1 obj1-p1 2026-03-08T22:44:17.921 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T22:44:18.153 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:428: TEST_backfill_pool_priority: local otherosd=null 2026-03-08T22:44:18.154 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:430: TEST_backfill_pool_priority: ceph pg dump pgs 2026-03-08T22:44:18.355 INFO:tasks.workunit.client.0.vm06.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:44:18.355 INFO:tasks.workunit.client.0.vm06.stdout:2.0 17 0 0 0 0 176160768 0 0 50 0 50 active+clean 2026-03-08T22:44:06.525099+0000 36'50 36:104 [0] 0 [0] 0 0'0 2026-03-08T22:43:53.274928+0000 0'0 2026-03-08T22:43:53.274928+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:31:30.654046+0000 0 0 2026-03-08T22:44:18.355 INFO:tasks.workunit.client.0.vm06.stdout:1.0 50 0 0 0 0 524288000 0 0 50 100 50 active+clean 2026-03-08T22:44:05.900867+0000 36'150 36:214 [1] 1 [1] 1 0'0 2026-03-08T22:43:51.636847+0000 0'0 2026-03-08T22:43:51.636847+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:29:56.266811+0000 0 0 2026-03-08T22:44:18.355 INFO:tasks.workunit.client.0.vm06.stdout: 2026-03-08T22:44:18.355 INFO:tasks.workunit.client.0.vm06.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. 
See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:44:18.355 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs 2026-03-08T22:44:18.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:431: TEST_backfill_pool_priority: ERRORS=0 2026-03-08T22:44:18.368 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:433: TEST_backfill_pool_priority: ceph osd pool set test1 size 2 2026-03-08T22:44:18.870 INFO:tasks.workunit.client.0.vm06.stderr:set pool 1 size to 2 2026-03-08T22:44:18.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:434: TEST_backfill_pool_priority: ceph osd pool set test2 size 2 2026-03-08T22:44:19.180 INFO:tasks.workunit.client.0.vm06.stderr:set pool 2 size to 2 2026-03-08T22:44:19.200 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:435: TEST_backfill_pool_priority: sleep 5 2026-03-08T22:44:24.204 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:436: TEST_backfill_pool_priority: get_asok_path osd.1 2026-03-08T22:44:24.204 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:44:24.204 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:44:24.204 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:44:24.204 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:44:24.204 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:44:24.204 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:44:24.205 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:436: TEST_backfill_pool_priority: CEPH_ARGS= 2026-03-08T22:44:24.205 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:436: TEST_backfill_pool_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout:osd.1 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:437: TEST_backfill_pool_priority: echo osd.1 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:438: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.1.out 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:44:24.279 
INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 152, 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "item": "2.0", 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 153, 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:44:24.279 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:44:24.280 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:439: TEST_backfill_pool_priority: get_asok_path osd.0 2026-03-08T22:44:24.280 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:44:24.280 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:44:24.280 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:44:24.280 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:44:24.280 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:44:24.280 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.0.asok 2026-03-08T22:44:24.281 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:439: TEST_backfill_pool_priority: CEPH_ARGS= 2026-03-08T22:44:24.281 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:439: TEST_backfill_pool_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.0.asok dump_recovery_reservations 
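Both reservation dumps are fetched the same way: get_asok_path resolves the daemon's admin socket under the per-run socket directory (/tmp/ceph-asok.19257 here), and the ceph CLI then talks to that socket directly with CEPH_ARGS cleared, exactly as the trace shows. The redirect into the dump.*.out files is not visible in the xtrace but is implied by the cat calls that display them:

    asok=/tmp/ceph-asok.19257/ceph-osd.1.asok    # result of get_asok_path osd.1
    CEPH_ARGS='' ceph --admin-daemon $asok dump_recovery_reservations \
        > td/osd-backfill-prio/dump.1.out

The two dumps mirror each other: osd.1 holds a local reservation for pg 1.0 at prio 152 and a remote one for pg 2.0 at prio 153, while osd.0 (shown next) reports the reverse, which matches the pg dump above where osd.1 is acting primary for 1.0 and osd.0 for 2.0.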
2026-03-08T22:44:24.353 INFO:tasks.workunit.client.0.vm06.stdout:osd.0 2026-03-08T22:44:24.353 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:440: TEST_backfill_pool_priority: echo osd.0 2026-03-08T22:44:24.353 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:441: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.0.out 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "item": "2.0", 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 153, 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 152, 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:44:24.354 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:44:24.355 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:444: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.1.out 2026-03-08T22:44:24.355 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:444: TEST_backfill_pool_priority: jq '.local_reservations.in_progress[0].item' 2026-03-08T22:44:24.365 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:444: TEST_backfill_pool_priority: eval 'ITEM="1.0"' 2026-03-08T22:44:24.365 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:444: TEST_backfill_pool_priority: ITEM=1.0 2026-03-08T22:44:24.365 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:445: TEST_backfill_pool_priority: '[' 1.0 '!=' 1.0 ']' 
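Every check from osd-backfill-prio.sh:444 down to :496 follows the same shape: cat the saved dump, pull one field out with jq, eval it into a shell variable, and fail the test if it differs from the expected value (the `'[' 1.0 '!=' 1.0 ']'` lines in the trace are those comparisons passing). A condensed sketch of that pattern, with a hypothetical helper name rather than the script's inline form:

    # check_field <dump-file> <jq-path> <expected>: fail if the field differs.
    check_field() {
        local dump=$1 path=$2 expected=$3 actual
        actual=$(jq -r "$path" "$dump") || return 1
        if [ "$actual" != "$expected" ]; then
            echo "ERROR: $path is $actual, expected $expected" >&2
            return 1
        fi
    }
    check_field td/osd-backfill-prio/dump.1.out \
        '.local_reservations.in_progress[0].item' 1.0
    check_field td/osd-backfill-prio/dump.1.out \
        '.local_reservations.in_progress[0].prio' 152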
2026-03-08T22:44:24.366 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:450: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.1.out 2026-03-08T22:44:24.366 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:450: TEST_backfill_pool_priority: jq '.local_reservations.in_progress[0].prio' 2026-03-08T22:44:24.378 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:450: TEST_backfill_pool_priority: PRIO=152 2026-03-08T22:44:24.378 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:451: TEST_backfill_pool_priority: '[' 152 '!=' 152 ']' 2026-03-08T22:44:24.378 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:459: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.0.out 2026-03-08T22:44:24.378 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:459: TEST_backfill_pool_priority: jq '.remote_reservations.in_progress[0].item' 2026-03-08T22:44:24.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:459: TEST_backfill_pool_priority: eval 'ITEM="1.0"' 2026-03-08T22:44:24.389 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:459: TEST_backfill_pool_priority: ITEM=1.0 2026-03-08T22:44:24.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:460: TEST_backfill_pool_priority: '[' 1.0 '!=' 1.0 ']' 2026-03-08T22:44:24.389 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:465: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.0.out 2026-03-08T22:44:24.389 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:465: TEST_backfill_pool_priority: jq '.remote_reservations.in_progress[0].prio' 2026-03-08T22:44:24.409 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:465: TEST_backfill_pool_priority: PRIO=152 2026-03-08T22:44:24.409 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:466: TEST_backfill_pool_priority: '[' 152 '!=' 152 ']' 2026-03-08T22:44:24.409 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:474: TEST_backfill_pool_priority: jq '.local_reservations.in_progress[0].item' 2026-03-08T22:44:24.412 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:474: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.0.out 2026-03-08T22:44:24.421 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:474: TEST_backfill_pool_priority: eval 'ITEM="2.0"' 2026-03-08T22:44:24.421 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:474: TEST_backfill_pool_priority: ITEM=2.0 2026-03-08T22:44:24.421 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:475: TEST_backfill_pool_priority: '[' 2.0 '!=' 2.0 ']' 2026-03-08T22:44:24.421 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:480: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.0.out 2026-03-08T22:44:24.421 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:480: TEST_backfill_pool_priority: jq '.local_reservations.in_progress[0].prio' 2026-03-08T22:44:24.431 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:480: TEST_backfill_pool_priority: PRIO=153 2026-03-08T22:44:24.432 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:481: TEST_backfill_pool_priority: '[' 153 '!=' 153 ']' 2026-03-08T22:44:24.432 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:489: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.1.out 2026-03-08T22:44:24.432 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:489: TEST_backfill_pool_priority: jq '.remote_reservations.in_progress[0].item' 2026-03-08T22:44:24.442 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:489: TEST_backfill_pool_priority: eval 'ITEM="2.0"' 2026-03-08T22:44:24.442 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:489: TEST_backfill_pool_priority: ITEM=2.0 2026-03-08T22:44:24.442 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:490: TEST_backfill_pool_priority: '[' 2.0 '!=' 2.0 ']' 2026-03-08T22:44:24.442 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:495: TEST_backfill_pool_priority: cat td/osd-backfill-prio/dump.1.out 2026-03-08T22:44:24.442 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:495: TEST_backfill_pool_priority: jq '.remote_reservations.in_progress[0].prio' 2026-03-08T22:44:24.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:495: TEST_backfill_pool_priority: PRIO=153 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:496: TEST_backfill_pool_priority: '[' 153 '!=' 153 ']' 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:503: TEST_backfill_pool_priority: wait_for_clean 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:44:24.455 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:44:24.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:44:24.529 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:44:24.529 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:44:24.529 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:44:24.530 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:44:24.530 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:44:24.530 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:44:24.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:44:24.804 INFO:tasks.workunit.client.0.vm06.stderr:1' 2026-03-08T22:44:24.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:44:24.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:24.804 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:44:24.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836494 2026-03-08T22:44:24.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836494 
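Two helper mechanics surface in this stretch. First, the delays array expanded at ceph-helpers.sh:1659: get_timeout_delays 90 .1 emits a doubling series, caps it (at 15s, judging by the output), and appends a final remainder so the terms sum to exactly the 90-second budget: 0.1+0.2+0.4+0.8+1.6+3.2+6.4+12.8 + 4x15 + 4.5 = 90. A sketch of the same idea, not the helper's actual code:

    # Emit doubling delays capped at 15s whose total is exactly $1 seconds.
    gen_delays() {
        local timeout=$1 delay=$2 cap=15 total=0
        while awk "BEGIN{exit !($total + $delay < $timeout)}"; do
            printf '%s ' "$delay"
            total=$(awk "BEGIN{print $total + $delay}")
            delay=$(awk "BEGIN{d=$delay*2; if (d>$cap) d=$cap; print d}")
        done
        awk "BEGIN{print $timeout - $total}"    # the trailing 4.5 above
    }
    gen_delays 90 0.1   # -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5

Second, flush_pg_stats (ceph-helpers.sh:2260 onward, traced through the lines that follow): before trusting any PG counts, each OSD is told to push its stats to the monitor, and the helper then waits for last-stat-seq to catch up with the sequence number the OSD returned, so the upcoming pg dump cannot be stale. Condensed:

    # Push PG stats from every OSD and wait for the mon to absorb them.
    for osd in $(ceph osd ls); do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        until [ "$(ceph osd last-stat-seq $osd)" -ge "$seq" ]; do sleep 1; done
    done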
2026-03-08T22:44:24.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494' 2026-03-08T22:44:24.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:24.902 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:44:25.005 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705677 2026-03-08T22:44:25.005 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705677 2026-03-08T22:44:25.005 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-38654705677' 2026-03-08T22:44:25.005 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:25.005 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836494 2026-03-08T22:44:25.005 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:25.006 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:44:25.006 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836494 2026-03-08T22:44:25.006 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:25.007 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836494 2026-03-08T22:44:25.007 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836494' 2026-03-08T22:44:25.007 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836494 2026-03-08T22:44:25.007 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:25.267 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836494 -lt 21474836494 2026-03-08T22:44:25.267 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:25.268 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705677 2026-03-08T22:44:25.268 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:25.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=1 2026-03-08T22:44:25.269 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705677 2026-03-08T22:44:25.269 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:25.270 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705677 2026-03-08T22:44:25.270 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705677 2026-03-08T22:44:25.270 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705677' 2026-03-08T22:44:25.270 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:25.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705677 -lt 38654705677 2026-03-08T22:44:25.502 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:44:25.502 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:25.502 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:25.850 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 2 == 0 2026-03-08T22:44:25.850 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:25.850 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:25.850 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:25.851 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:25.851 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:25.851 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:25.856 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:26.132 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:44:26.133 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:26.133 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:26.133 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:26.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 2 2026-03-08T22:44:26.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' -1 2026-03-08T22:44:26.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:44:26.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=0 2026-03-08T22:44:26.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:26.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:26.570 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:26.570 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:26.570 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:26.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:26.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:26.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:26.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:26.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:26.835 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:44:26.836 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:26.836 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
get_num_pgs: ceph --format json status 2026-03-08T22:44:26.836 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:27.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 2 2026-03-08T22:44:27.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:44:27.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:44:27.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:44:27.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:44:27.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:44:27.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:44:27.160 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:44:27.160 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:44:27.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=103045814 2026-03-08T22:44:27.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 103045814 '!=' null 2026-03-08T22:44:27.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:44:27.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:27.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:27.580 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:27.580 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:27.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:27.581 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:27.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:27.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:27.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:27.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:27.832 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:44:27.832 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:27.833 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:27.833 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 2 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:44:28.158 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + 
.recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:44:28.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=103045814 2026-03-08T22:44:28.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 103045814 '!=' null 2026-03-08T22:44:28.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:44:28.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:28.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:28.571 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:28.571 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:28.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:28.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:28.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:28.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:28.571 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:28.572 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:28.804 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:44:28.805 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:28.805 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:28.805 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 2 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 
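The get_is_making_recovery_progress calls interleaved through this stretch explain why the loop counter keeps snapping back to 0: the helper sums the pgmap's recovering_keys/bytes/objects_per_sec rates, and a non-null result (here 103045814, then 91737080) counts as progress, so the 90-second budget is only consumed by consecutive polls during which recovery is actually stalled. The probe, as the trace shows the helper running it:

    # A null sum means the pgmap carries no recovery-rate fields at all;
    # anything else resets wait_for_clean's retry index to 0.
    ceph --format json status |
        jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec
                      + .recovering_objects_per_sec'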
2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:44:29.111 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:44:29.427 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=91737080 2026-03-08T22:44:29.427 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 91737080 '!=' null 2026-03-08T22:44:29.427 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:44:29.427 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:29.427 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:29.528 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:29.528 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:29.528 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:29.528 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:29.528 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:29.529 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:29.529 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:29.529 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:29.757 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:44:29.757 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:29.757 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:29.757 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 2 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:44:30.066 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:44:30.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=91737080 2026-03-08T22:44:30.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 91737080 '!=' null 2026-03-08T22:44:30.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 
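Zoomed out, the whole wait_for_clean stretch from 22:44:24 onward is one polling loop: count the PGs whose state contains active+clean (and not stale), break when that count reaches num_pgs, otherwise sleep through the delays array, resetting the retry index whenever the count moves or recovery is progressing. A condensed paraphrase of the loop seen in the trace, with the jq probes inlined; the helper names are the real ones from ceph-helpers.sh, but the bodies are sketches rather than the script's exact code:

    get_num_pgs() { ceph --format json status | jq .pgmap.num_pgs; }
    get_num_active_clean() {
        ceph --format json pg dump pgs |
            jq '[.pg_stats[].state
                 | select(contains("active") and contains("clean"))
                 | select(contains("stale") | not)] | length'
    }
    get_is_making_recovery_progress() {
        local p
        p=$(ceph --format json status |
            jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec
                          + .recovering_objects_per_sec')
        [ "$p" != null ]
    }
    delays=(0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5)  # get_timeout_delays 90 .1
    num_active_clean=-1 loop=0
    while true; do
        cur=$(get_num_active_clean)
        [ "$cur" = "$(get_num_pgs)" ] && break          # everything clean: done
        if [ "$cur" != "$num_active_clean" ]; then
            loop=0; num_active_clean=$cur               # count moved: reset budget
        elif get_is_making_recovery_progress; then
            loop=0                                      # data still flowing: reset
        fi
        [ "$loop" -ge "${#delays[@]}" ] && exit 1       # stalled through all delays
        sleep "${delays[$loop]}"
        loop=$((loop + 1))
    done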
2026-03-08T22:44:30.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:30.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:30.468 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:30.468 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:30.468 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:30.469 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:30.469 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:30.469 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:30.469 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:30.469 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:30.703 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:44:30.703 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:30.704 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:30.704 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 2 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 
2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:44:30.999 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:44:31.301 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=93048346 2026-03-08T22:44:31.301 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 93048346 '!=' null 2026-03-08T22:44:31.301 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:44:31.301 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:31.301 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:31.402 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:31.402 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:31.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:31.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:31.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:31.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:31.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:31.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:31.637 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 
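With cur_active_clean reaching 1 here and 2 a few polls later, the experiment's visible outcome is complete: the two PGs backfilled concurrently, each holding the single local reservation slot on its own primary and the single remote slot on its peer, and they reach active+clean a couple of seconds apart. Outside a scripted run the same progression can be watched interactively, e.g.:

    # Poll PG ids and states once a second until everything reads active+clean.
    watch -n1 'ceph pg dump pgs_brief'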
2026-03-08T22:44:31.637 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:31.637 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:31.637 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:31.959 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 2 2026-03-08T22:44:31.959 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 0 2026-03-08T22:44:31.959 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:44:31.959 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=1 2026-03-08T22:44:31.959 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:31.959 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:32.060 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:32.060 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:32.061 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:32.061 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:32.061 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:32.061 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:32.061 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:32.061 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:32.295 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:44:32.295 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:32.295 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:32.295 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:32.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 2 2026-03-08T22:44:32.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:44:32.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:44:32.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:44:32.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:44:32.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:44:32.591 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:44:32.592 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:44:32.592 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:44:32.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=93048346 2026-03-08T22:44:32.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 93048346 '!=' null 2026-03-08T22:44:32.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:44:32.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:44:32.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 
2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:32.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:33.230 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:44:33.230 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:33.230 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:33.230 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 2 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stdout:TEST PASSED 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:505: TEST_backfill_pool_priority: '[' 0 '!=' 0 ']' 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:509: TEST_backfill_pool_priority: echo TEST PASSED 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:512: TEST_backfill_pool_priority: delete_pool test1 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test1 2026-03-08T22:44:33.518 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test1 test1 --yes-i-really-really-mean-it 2026-03-08T22:44:33.795 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test1' does not exist 2026-03-08T22:44:33.807 
2026-03-08T22:44:33.807 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:513: TEST_backfill_pool_priority: delete_pool test2
2026-03-08T22:44:33.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test2
2026-03-08T22:44:33.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test2 test2 --yes-i-really-really-mean-it
2026-03-08T22:44:34.130 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test2' does not exist
2026-03-08T22:44:34.143 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:514: TEST_backfill_pool_priority: kill_daemons td/osd-backfill-prio
2026-03-08T22:44:34.143 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:44:34.144 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:44:34.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:44:34.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:44:34.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:44:39.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
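kill_daemons first records whether xtrace is on (the shopt -q -o xtrace probe above), stores the answer in a local, and switches tracing off so the kill loop does not flood the log; that is why roughly five quiet seconds pass before the return at ceph-helpers.sh:362. Roughly, as a sketch (the matching re-enable happens at the end of the helper and is not visible in this excerpt):

    shopt -q -o xtrace && trace=true || trace=false
    $trace && shopt -u -o xtrace    # silence the loop that signals and waits on each daemon
    # ... TERM/KILL each daemon recorded under the test dir and wait for it ...
    $trace && shopt -s -o xtrace    # restore tracing before returning (assumed, not shown above)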
2026-03-08T22:44:39.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:515: TEST_backfill_pool_priority: return 0
2026-03-08T22:44:39.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:43: run: teardown td/osd-backfill-prio
2026-03-08T22:44:39.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-backfill-prio
2026-03-08T22:44:39.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:44:39.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-backfill-prio KILL
2026-03-08T22:44:39.466 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:44:39.467 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:44:39.467 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:44:39.467 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:44:39.467 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:44:39.472 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:44:39.473 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:44:39.473 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:44:39.473 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:44:39.474 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:44:39.474 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:44:39.474 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:44:39.475 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:44:39.475 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:44:39.475 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:44:39.475 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:44:39.476 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:44:39.476 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:44:39.477 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-backfill-prio
2026-03-08T22:44:39.491 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:44:39.491 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.491 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.491 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19257
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
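The teardown trace above shows the core-dump check: read kernel.core_pattern, skip piped patterns (the '[' / = '|' ']' test compares the pattern's first character), then look at the pattern's directory for dumped cores before deciding whether to keep anything; only then are the test dir and the asok dir removed. A loose sketch of the traced branch (the real helper also matches the directory name against ^core\|core$):

    pattern=$(sysctl -n kernel.core_pattern)   # /home/ubuntu/cephtest/archive/coredump/%t.%p.core in this run
    cores=no
    if [ "${pattern:0:1}" != '|' ]; then                  # piped core handlers are skipped
        dir=$(dirname "$pattern")
        if [ -n "$(ls "$dir" 2>/dev/null)" ]; then        # any dumped cores?
            cores=yes                                     # empty here, so cores stayed "no"
        fi
    fi
    [ "$cores" = yes -o "$dumplogs" = 1 ] || rm -fr td/osd-backfill-prio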
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:40: run: for func in $funcs
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:41: run: setup td/osd-backfill-prio
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-backfill-prio
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-backfill-prio
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-backfill-prio
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-backfill-prio KILL
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:44:39.492 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:44:39.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:44:39.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:44:39.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:44:39.494 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:44:39.494 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:44:39.495 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:44:39.495 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:44:39.496 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:44:39.496 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:44:39.496 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:44:39.497 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:44:39.497 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:44:39.497 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:44:39.497 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:44:39.498 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:44:39.498 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:44:39.498 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-backfill-prio
2026-03-08T22:44:39.499 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:44:39.499 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.499 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.499 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19257
2026-03-08T22:44:39.500 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:44:39.500 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:44:39.500 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-backfill-prio
2026-03-08T22:44:39.501 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:44:39.501 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.501 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.501 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.19257
2026-03-08T22:44:39.502 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:44:39.502 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-backfill-prio 1' TERM HUP INT
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:42: run: TEST_backfill_priority td/osd-backfill-prio
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:49: TEST_backfill_priority: local dir=td/osd-backfill-prio
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:50: TEST_backfill_priority: local pools=10
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:51: TEST_backfill_priority: local OSDS=5
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:53: TEST_backfill_priority: expr 150 + 1
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:53: TEST_backfill_priority: local degraded_prio=151
2026-03-08T22:44:39.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:54: TEST_backfill_priority: local max_tries=10
2026-03-08T22:44:39.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:56: TEST_backfill_priority: run_mon td/osd-backfill-prio a
2026-03-08T22:44:39.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-backfill-prio
2026-03-08T22:44:39.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:44:39.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:44:39.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:44:39.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-backfill-prio/a
2026-03-08T22:44:39.504 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-backfill-prio/a --run-dir=td/osd-backfill-prio
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok'
2026-03-08T22:44:39.542 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-backfill-prio/a '--log-file=td/osd-backfill-prio/$name.log' '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --mon-cluster-log-file=td/osd-backfill-prio/log --run-dir=td/osd-backfill-prio '--pid-file=td/osd-backfill-prio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:44:39.580 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:44:39.581 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:44:39.582 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.582 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:44:39.582 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.582 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-mon.a.asok
2026-03-08T22:44:39.582 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:44:39.582 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19257/ceph-mon.a.asok config get fsid
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.656 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-mon.a.asok
2026-03-08T22:44:39.662 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:44:39.662 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.19257/ceph-mon.a.asok config get mon_host
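get_config reads running-daemon settings through the admin socket rather than via the monitor: it clears CEPH_ARGS so the CLI does not try to contact a cluster, then queries the daemon's .asok directly. The same query by hand, using the socket path from this run:

    CEPH_ARGS='' ceph --format json daemon /tmp/ceph-asok.19257/ceph-mon.a.asok config get fsid | jq -r .fsid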
2026-03-08T22:44:39.730 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:57: TEST_backfill_priority: run_mgr td/osd-backfill-prio x
2026-03-08T22:44:39.730 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-backfill-prio
2026-03-08T22:44:39.730 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:44:39.730 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:44:39.731 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:44:39.731 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-backfill-prio/x
2026-03-08T22:44:39.731 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:44:39.949 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:44:39.949 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:44:39.950 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:44:39.950 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:44:39.950 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.950 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.950 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok'
2026-03-08T22:44:39.950 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:44:39.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-backfill-prio/x '--log-file=td/osd-backfill-prio/$name.log' '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --run-dir=td/osd-backfill-prio '--pid-file=td/osd-backfill-prio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:44:39.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:58: TEST_backfill_priority: export CEPH_ARGS
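The export CEPH_ARGS at osd-backfill-prio.sh:58 is what lets every later bare ceph invocation reach this throwaway cluster: the helpers compose the mon address, fsid and auth settings into CEPH_ARGS, which the CLI merges as if they were command-line flags. Illustratively (values taken from the run_osd trace below; the exact string assembled by run_mon is not fully shown in this excerpt):

    export CEPH_ARGS="--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114"
    ceph status    # now targets the test mon on 127.0.0.1:7114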
2026-03-08T22:44:39.973 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:60: TEST_backfill_priority: expr 5 - 1
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:60: TEST_backfill_priority: seq 0 4
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:60: TEST_backfill_priority: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:62: TEST_backfill_priority: run_osd td/osd-backfill-prio 0
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-prio
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-prio/0
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq '
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-prio/0'
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-prio/0/journal'
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-prio'
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:44:39.978 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-prio/$name.log'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-prio/$name.pid'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:44:39.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-prio/0
2026-03-08T22:44:39.980 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:44:39.980 INFO:tasks.workunit.client.0.vm06.stdout:add osd0 4aa3c4a0-f161-409f-a8be-9058f582c29f
2026-03-08T22:44:39.980 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4aa3c4a0-f161-409f-a8be-9058f582c29f
2026-03-08T22:44:39.981 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 4aa3c4a0-f161-409f-a8be-9058f582c29f'
2026-03-08T22:44:39.981 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:44:39.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDX+61pGlhKOxAAPGGARazYCeg8LOmB7ITWSQ==
2026-03-08T22:44:39.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDX+61pGlhKOxAAPGGARazYCeg8LOmB7ITWSQ=="}'
2026-03-08T22:44:39.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4aa3c4a0-f161-409f-a8be-9058f582c29f -i td/osd-backfill-prio/0/new.json
2026-03-08T22:44:40.146 INFO:tasks.workunit.client.0.vm06.stdout:0
2026-03-08T22:44:40.161 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-prio/0/new.json
2026-03-08T22:44:40.162 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/0 --osd-journal=td/osd-backfill-prio/0/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDX+61pGlhKOxAAPGGARazYCeg8LOmB7ITWSQ== --osd-uuid 4aa3c4a0-f161-409f-a8be-9058f582c29f
2026-03-08T22:44:40.190 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:40.185+0000 7f83e1dea8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:44:40.193 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:40.189+0000 7f83e1dea8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:44:40.195 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:40.189+0000 7f83e1dea8c0 -1 WARNING: all dangerous and experimental features are enabled.
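run_osd provisions each OSD in two passes, both visible above: it first registers the OSD with the cluster (uuidgen, a generated cephx key, and ceph osd new, which prints the allocated id), then formats the data directory with ceph-osd --mkfs before the real start. Condensed from the trace (values are this run's; $ceph_args is the flag string assembled above). The bdev "Operation not permitted" and _read_fsid complaints that follow are emitted by this mkfs run against a fresh directory; mkfs nonetheless completed, as the subsequent start shows.

    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > td/osd-backfill-prio/0/new.json
    ceph osd new $uuid -i td/osd-backfill-prio/0/new.json   # prints the allocated id, 0 here
    rm td/osd-backfill-prio/0/new.json
    ceph-osd -i 0 $ceph_args --mkfs --key $OSD_SECRET --osd-uuid $uuid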
2026-03-08T22:44:40.195 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:40.189+0000 7f83e1dea8c0 -1 bdev(0x5568fa5e8c00 td/osd-backfill-prio/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:44:40.195 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:40.189+0000 7f83e1dea8c0 -1 bluestore(td/osd-backfill-prio/0) _read_fsid unparsable uuid
2026-03-08T22:44:42.699 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-prio/0/keyring
2026-03-08T22:44:42.699 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:44:42.700 INFO:tasks.workunit.client.0.vm06.stdout:adding osd0 key to auth repository
2026-03-08T22:44:42.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:44:42.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-prio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:44:43.000 INFO:tasks.workunit.client.0.vm06.stdout:start osd.0
2026-03-08T22:44:43.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:44:43.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/0 --osd-journal=td/osd-backfill-prio/0/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:44:43.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:44:43.001 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:44:43.006 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:44:43.026 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:43.017+0000 7fd56199e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:44:43.028 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:43.025+0000 7fd56199e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:44:43.030 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:43.025+0000 7fd56199e8c0 -1 WARNING: all dangerous and experimental features are enabled.
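After launching the daemon, run_osd checks that the cluster has no noup flag set and then calls wait_for_osd, whose trace follows: up to 300 one-second polls of ceph osd dump, grepping for the OSD's "up" line. Stripped of the tracing, the loop is:

    status=1
    for ((i = 0; i < 300; i++)); do
        if ceph osd dump | grep 'osd.0 up'; then
            status=0
            break
        fi
        sleep 1
    done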
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stdout:0
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:44:43.241 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:44:43.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:44:43.729 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:43.725+0000 7fd56199e8c0 -1 Falling back to public interface
2026-03-08T22:44:44.462 INFO:tasks.workunit.client.0.vm06.stdout:1
2026-03-08T22:44:44.462 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:44:44.462 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:44:44.462 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:44:44.462 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:44:44.462 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:44:44.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:44:44.946 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:44.941+0000 7fd56199e8c0 -1 osd.0 0 log_to_monitors true
2026-03-08T22:44:45.699 INFO:tasks.workunit.client.0.vm06.stdout:2
2026-03-08T22:44:45.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:44:45.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:44:45.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:44:45.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:44:45.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:44:45.926 INFO:tasks.workunit.client.0.vm06.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/991472859,v1:127.0.0.1:6803/991472859] [v2:127.0.0.1:6804/991472859,v1:127.0.0.1:6805/991472859] exists,up 4aa3c4a0-f161-409f-a8be-9058f582c29f
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:60: TEST_backfill_priority: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:62: TEST_backfill_priority: run_osd td/osd-backfill-prio 1
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-prio
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-prio/1
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq '
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-prio/1'
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-prio/1/journal'
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:44:45.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-prio'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-prio/$name.log'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-prio/$name.pid'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:44:45.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-prio/1
2026-03-08T22:44:45.929 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:44:45.929 INFO:tasks.workunit.client.0.vm06.stdout:add osd1 a607aea3-4834-46c3-a649-e6e1f92302d2
2026-03-08T22:44:45.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a607aea3-4834-46c3-a649-e6e1f92302d2
2026-03-08T22:44:45.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 a607aea3-4834-46c3-a649-e6e1f92302d2'
2026-03-08T22:44:45.930 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:44:45.943 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDd+61pcl0wOBAAIuTJtfyB1NNpgSP1rxyG+w==
2026-03-08T22:44:45.943 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDd+61pcl0wOBAAIuTJtfyB1NNpgSP1rxyG+w=="}'
2026-03-08T22:44:45.943 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a607aea3-4834-46c3-a649-e6e1f92302d2 -i td/osd-backfill-prio/1/new.json
2026-03-08T22:44:46.177 INFO:tasks.workunit.client.0.vm06.stdout:1
2026-03-08T22:44:46.192 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-prio/1/new.json
2026-03-08T22:44:46.193 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/1 --osd-journal=td/osd-backfill-prio/1/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDd+61pcl0wOBAAIuTJtfyB1NNpgSP1rxyG+w== --osd-uuid a607aea3-4834-46c3-a649-e6e1f92302d2
2026-03-08T22:44:46.212 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:46.205+0000 7f0226a4b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:44:46.214 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:46.209+0000 7f0226a4b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:44:46.215 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:46.209+0000 7f0226a4b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:44:46.215 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:46.209+0000 7f0226a4b8c0 -1 bdev(0x555f07c05c00 td/osd-backfill-prio/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:44:46.215 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:46.209+0000 7f0226a4b8c0 -1 bluestore(td/osd-backfill-prio/1) _read_fsid unparsable uuid
2026-03-08T22:44:49.242 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-prio/1/keyring
2026-03-08T22:44:49.242 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:44:49.242 INFO:tasks.workunit.client.0.vm06.stdout:adding osd1 key to auth repository
2026-03-08T22:44:49.243 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:44:49.243 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-prio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:44:49.666 INFO:tasks.workunit.client.0.vm06.stdout:start osd.1
2026-03-08T22:44:49.666 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:44:49.666 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/1 --osd-journal=td/osd-backfill-prio/1/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:44:49.666 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:44:49.666 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:44:49.668 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:44:49.687 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:49.681+0000 7f36c8d408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:49.693 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:49.689+0000 7f36c8d408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:49.695 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:49.689+0000 7f36c8d408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:49.901 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:49.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:44:50.126 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:50.157 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:50.153+0000 7f36c8d408c0 -1 Falling back to public interface 2026-03-08T22:44:51.123 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:51.117+0000 7f36c8d408c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:44:51.127 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:44:51.127 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:51.127 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:51.127 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:44:51.127 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:51.127 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:44:51.365 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 
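The wait_for_osd trace above (ceph-helpers.sh:978-991) reduces to a bounded one-second poll: grep the OSD dump until the daemon reports the requested state, for at most 300 attempts. A minimal reconstruction from the xtrace; only the timeout return path is inferred:

    wait_for_osd() {
        local state=$1           # e.g. "up"
        local id=$2
        local status=1
        for ((i = 0; i < 300; i++)); do
            echo $i              # the progress counter seen on stdout above
            if ceph osd dump | grep "osd.$id $state"; then
                status=0
                break
            fi
            sleep 1
        done
        return $status
    }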
2026-03-08T22:44:52.367 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:44:52.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:52.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:52.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:44:52.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:52.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stdout:osd.1 up in weight 1 up_from 9 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/501360077,v1:127.0.0.1:6811/501360077] [v2:127.0.0.1:6812/501360077,v1:127.0.0.1:6813/501360077] exists,up a607aea3-4834-46c3-a649-e6e1f92302d2 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:60: TEST_backfill_priority: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:62: TEST_backfill_priority: run_osd td/osd-backfill-prio 2 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-prio 2026-03-08T22:44:52.589 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-prio/2 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq ' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:44:52.590 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-prio/2' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-prio/2/journal' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-prio' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-prio/$name.log' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: 
run_osd: ceph_args+=' --pid-file=td/osd-backfill-prio/$name.pid' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:44:52.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-prio/2 2026-03-08T22:44:52.591 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:44:52.592 INFO:tasks.workunit.client.0.vm06.stdout:add osd2 42579ac2-f9a6-401c-9977-4b51e9819f92 2026-03-08T22:44:52.592 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=42579ac2-f9a6-401c-9977-4b51e9819f92 2026-03-08T22:44:52.592 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 42579ac2-f9a6-401c-9977-4b51e9819f92' 2026-03-08T22:44:52.592 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:44:52.608 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDk+61pO2sqJBAADoryvdW8vPVYRmHCkpwAhw== 2026-03-08T22:44:52.608 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDk+61pO2sqJBAADoryvdW8vPVYRmHCkpwAhw=="}' 2026-03-08T22:44:52.608 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 42579ac2-f9a6-401c-9977-4b51e9819f92 -i td/osd-backfill-prio/2/new.json 2026-03-08T22:44:52.852 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:44:52.865 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-prio/2/new.json 2026-03-08T22:44:52.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-backfill-prio/2 --osd-journal=td/osd-backfill-prio/2/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDk+61pO2sqJBAADoryvdW8vPVYRmHCkpwAhw== --osd-uuid 42579ac2-f9a6-401c-9977-4b51e9819f92 2026-03-08T22:44:52.885 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:52.881+0000 7f49241258c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:52.886 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:52.881+0000 7f49241258c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:52.888 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:52.881+0000 7f49241258c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:52.888 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:52.885+0000 7f49241258c0 -1 bdev(0x5556a2e15c00 td/osd-backfill-prio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:44:52.888 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:52.885+0000 7f49241258c0 -1 bluestore(td/osd-backfill-prio/2) _read_fsid unparsable uuid 2026-03-08T22:44:55.154 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-prio/2/keyring 2026-03-08T22:44:55.154 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:44:55.155 INFO:tasks.workunit.client.0.vm06.stdout:adding osd2 key to auth repository 2026-03-08T22:44:55.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:44:55.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-prio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:44:55.454 INFO:tasks.workunit.client.0.vm06.stdout:start osd.2 2026-03-08T22:44:55.454 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:44:55.454 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/2 --osd-journal=td/osd-backfill-prio/2/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:44:55.454 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:44:55.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:44:55.456 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:44:55.471 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:55.465+0000 7f52f65678c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:55.472 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:55.469+0000 7f52f65678c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:55.474 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:55.469+0000 7f52f65678c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:55.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:44:55.918 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:55.933 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:55.929+0000 7f52f65678c0 -1 Falling back to public interface 2026-03-08T22:44:56.919 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:44:56.919 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:56.919 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:56.919 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:44:56.919 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
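Line 681, traced as three pipeline stages before each wait above, is how run_osd decides whether to block on wait_for_osd: dump the osdmap as JSON, extract the cluster flags with jq, and test for "noup". A plausible reading, with the surrounding conditional inferred (xtrace shows only the pipeline stages, not the if/else around them):

    # Only wait for the OSD to report up when the cluster does not have
    # the "noup" flag set; under noup the daemon can never be marked up.
    if ! ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'; then
        wait_for_osd up "$id" || return 1
    fi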
2026-03-08T22:44:56.919 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:44:57.151 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:57.401 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:57.397+0000 7f52f65678c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:44:58.152 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:44:58.152 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:58.152 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:58.152 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:44:58.152 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:58.153 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:44:58.395 INFO:tasks.workunit.client.0.vm06.stdout:osd.2 up in weight 1 up_from 13 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/78877505,v1:127.0.0.1:6819/78877505] [v2:127.0.0.1:6820/78877505,v1:127.0.0.1:6821/78877505] exists,up 42579ac2-f9a6-401c-9977-4b51e9819f92 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:60: TEST_backfill_priority: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:62: TEST_backfill_priority: run_osd td/osd-backfill-prio 3 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-prio 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-prio/3 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 
'ceph_args=--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq ' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-prio/3' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-prio/3/journal' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-prio' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:44:58.396 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-prio/$name.log' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-prio/$name.pid' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:44:58.396 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-prio/3 2026-03-08T22:44:58.397 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:44:58.398 INFO:tasks.workunit.client.0.vm06.stdout:add osd3 24964adf-0854-494f-b2ef-cac18c8f29dd 2026-03-08T22:44:58.398 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=24964adf-0854-494f-b2ef-cac18c8f29dd 2026-03-08T22:44:58.398 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 24964adf-0854-494f-b2ef-cac18c8f29dd' 2026-03-08T22:44:58.398 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:44:58.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDq+61pb16AGBAA/+OHRZuhvdOx5pA+D2pTdA== 2026-03-08T22:44:58.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDq+61pb16AGBAA/+OHRZuhvdOx5pA+D2pTdA=="}' 2026-03-08T22:44:58.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 24964adf-0854-494f-b2ef-cac18c8f29dd -i td/osd-backfill-prio/3/new.json 2026-03-08T22:44:58.653 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:44:58.666 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm 
td/osd-backfill-prio/3/new.json 2026-03-08T22:44:58.667 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/3 --osd-journal=td/osd-backfill-prio/3/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDq+61pb16AGBAA/+OHRZuhvdOx5pA+D2pTdA== --osd-uuid 24964adf-0854-494f-b2ef-cac18c8f29dd 2026-03-08T22:44:58.685 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:58.681+0000 7f85be77e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:58.687 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:58.681+0000 7f85be77e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:58.690 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:58.685+0000 7f85be77e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:58.690 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:58.685+0000 7f85be77e8c0 -1 bdev(0x55c2978c5c00 td/osd-backfill-prio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:44:58.690 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:44:58.685+0000 7f85be77e8c0 -1 bluestore(td/osd-backfill-prio/3) _read_fsid unparsable uuid 2026-03-08T22:45:01.437 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-prio/3/keyring 2026-03-08T22:45:01.437 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:45:01.438 INFO:tasks.workunit.client.0.vm06.stdout:adding osd3 key to auth repository 2026-03-08T22:45:01.438 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:45:01.438 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-prio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:45:01.735 INFO:tasks.workunit.client.0.vm06.stdout:start osd.3 2026-03-08T22:45:01.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:45:01.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-backfill-prio/3 --osd-journal=td/osd-backfill-prio/3/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:45:01.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:45:01.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:45:01.740 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:45:01.753 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:01.745+0000 7f46414648c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:45:01.757 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:01.753+0000 7f46414648c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:45:01.759 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:01.753+0000 7f46414648c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:45:02.206 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:45:03.208 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:45:03.208 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:45:03.208 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:45:03.208 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:45:03.208 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:45:03.208 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:45:03.209 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:03.205+0000 7f46414648c0 -1 Falling back to public interface 2026-03-08T22:45:03.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:45:04.183 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:04.177+0000 7f46414648c0 -1 osd.3 0 log_to_monitors true 2026-03-08T22:45:04.455 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:45:04.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:45:04.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:45:04.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:45:04.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:45:04.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stdout:osd.3 up in weight 1 up_from 18 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3170442381,v1:127.0.0.1:6827/3170442381] [v2:127.0.0.1:6828/3170442381,v1:127.0.0.1:6829/3170442381] exists,up 24964adf-0854-494f-b2ef-cac18c8f29dd 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:60: TEST_backfill_priority: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:62: TEST_backfill_priority: run_osd td/osd-backfill-prio 4 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-prio 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 
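The osd-backfill-prio.sh:60-62 entries that recur before each run_osd call above are the driver loop of TEST_backfill_priority bringing up the OSDs one by one. Reconstructed from the trace; $OSDS and the failure handling are assumed, since xtrace shows only the loop header and the call:

    # Bring up OSDs 0..OSDS-1 in the shared test directory.
    for osd in $(seq 0 $(expr $OSDS - 1)); do
        run_osd "$dir" "$osd" || return 1
    done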
2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-prio/4 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq ' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-prio/4' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-prio/4/journal' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:45:04.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-prio' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: 
ceph_args+=' --admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-prio/$name.log' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-prio/$name.pid' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:45:04.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-prio/4 2026-03-08T22:45:04.687 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:45:04.688 INFO:tasks.workunit.client.0.vm06.stdout:add osd4 502ff400-e00c-453f-9fc2-131649dbcf06 2026-03-08T22:45:04.688 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=502ff400-e00c-453f-9fc2-131649dbcf06 2026-03-08T22:45:04.688 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 502ff400-e00c-453f-9fc2-131649dbcf06' 2026-03-08T22:45:04.688 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:45:04.701 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDw+61pXwvJKRAAtalREuSO6a+hHtOtJcbTuw== 2026-03-08T22:45:04.701 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDw+61pXwvJKRAAtalREuSO6a+hHtOtJcbTuw=="}' 
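Lines 662-667 of ceph-helpers.sh, traced once per OSD above, register the new daemon with the monitor before mkfs: generate a uuid and a cephx key, hand both to "ceph osd new" through a throwaway JSON file, and clean up. The commands below are verbatim from the xtrace; the file redirection is an assumption, because bash xtrace prints command words but not redirections (which is why line 665's echo shows no target):

    uuid=$(uuidgen)
    echo "add osd$id $uuid"
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    # Write the secret where "ceph osd new" can read it (redirect inferred).
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > "$osd_data/new.json"
    # Allocate the OSD id in the osdmap, bound to this uuid and key.
    ceph osd new "$uuid" -i "$osd_data/new.json"
    rm "$osd_data/new.json"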
2026-03-08T22:45:04.701 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 502ff400-e00c-453f-9fc2-131649dbcf06 -i td/osd-backfill-prio/4/new.json 2026-03-08T22:45:04.921 INFO:tasks.workunit.client.0.vm06.stdout:4 2026-03-08T22:45:04.933 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-prio/4/new.json 2026-03-08T22:45:04.934 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/4 --osd-journal=td/osd-backfill-prio/4/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDw+61pXwvJKRAAtalREuSO6a+hHtOtJcbTuw== --osd-uuid 502ff400-e00c-453f-9fc2-131649dbcf06 2026-03-08T22:45:04.951 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:04.945+0000 7fa5ec80b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:45:04.953 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:04.949+0000 7fa5ec80b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:45:04.954 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:04.949+0000 7fa5ec80b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:45:04.955 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:04.949+0000 7fa5ec80b8c0 -1 bdev(0x55a17d10fc00 td/osd-backfill-prio/4/block) open stat got: (1) Operation not permitted 2026-03-08T22:45:04.955 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:04.949+0000 7fa5ec80b8c0 -1 bluestore(td/osd-backfill-prio/4) _read_fsid unparsable uuid 2026-03-08T22:45:08.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-prio/4/keyring 2026-03-08T22:45:08.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:45:08.159 INFO:tasks.workunit.client.0.vm06.stdout:adding osd4 key to auth repository 2026-03-08T22:45:08.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T22:45:08.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-prio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:45:08.460 INFO:tasks.workunit.client.0.vm06.stdout:start osd.4 2026-03-08T22:45:08.460 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T22:45:08.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=af45c927-3099-493c-a376-49f525793d46 --auth-supported=none --mon-host=127.0.0.1:7114 --osd_max_backfills=1 --debug_reserver=20 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-prio/4 --osd-journal=td/osd-backfill-prio/4/journal --chdir= --run-dir=td/osd-backfill-prio '--admin-socket=/tmp/ceph-asok.19257/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-prio/$name.log' '--pid-file=td/osd-backfill-prio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:45:08.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:45:08.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:45:08.468 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:45:08.478 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:08.473+0000 7f2b1c9b58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:45:08.479 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:08.473+0000 7f2b1c9b58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:45:08.481 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:08.477+0000 7f2b1c9b58c0 -1 WARNING: all dangerous and experimental features are enabled. 
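After mkfs, lines 670-678 (traced most recently for osd.4 above) import the keyring that mkfs left in the data directory into the cluster's auth database with the standard OSD capability profile, then start the daemon. A sketch from the xtrace; line 671 traces as a bare "cat", so its input is rendered here as a redirection, which xtrace does not display:

    # Inside run_osd, after the --mkfs phase has populated $osd_data.
    key_fn=$osd_data/keyring
    cat < "$key_fn"                 # show the generated keyring in the log
    echo "adding osd$id key to auth repository"
    ceph -i "$key_fn" auth add "osd.$id" \
        osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
    echo "start osd.$id"
    ceph-osd -i "$id" $ceph_args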
2026-03-08T22:45:08.697 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:45:08.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:45:08.923 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:45:09.177 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:09.173+0000 7f2b1c9b58c0 -1 Falling back to public interface 2026-03-08T22:45:09.924 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:45:09.924 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:45:09.924 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:45:09.924 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:45:09.924 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:45:09.924 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:45:10.197 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:45:10.260 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:45:10.253+0000 7f2b1c9b58c0 -1 osd.4 0 log_to_monitors true 2026-03-08T22:45:11.198 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:45:11.199 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:45:11.199 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:45:11.199 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:45:11.199 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:45:11.199 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:45:11.531 INFO:tasks.workunit.client.0.vm06.stdout:osd.4 up in weight 1 up_from 23 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/2514024570,v1:127.0.0.1:6835/2514024570] [v2:127.0.0.1:6836/2514024570,v1:127.0.0.1:6837/2514024570] exists,up 502ff400-e00c-453f-9fc2-131649dbcf06 2026-03-08T22:45:11.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:45:11.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:45:11.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:45:11.537 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: seq 1 10 2026-03-08T22:45:11.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:11.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test1 1 1 2026-03-08T22:45:11.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test1 1 1 2026-03-08T22:45:12.054 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test1' already exists 2026-03-08T22:45:12.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:13.068 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test1 size 2 2026-03-08T22:45:13.410 INFO:tasks.workunit.client.0.vm06.stderr:set pool 1 size to 2 2026-03-08T22:45:13.431 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:13.432 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test2 1 1 2026-03-08T22:45:13.432 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test2 1 1 2026-03-08T22:45:13.672 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test2' already exists 2026-03-08T22:45:13.684 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:14.685 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test2 size 2 2026-03-08T22:45:15.010 
INFO:tasks.workunit.client.0.vm06.stderr:set pool 2 size to 2 2026-03-08T22:45:15.029 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:15.029 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test3 1 1 2026-03-08T22:45:15.029 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test3 1 1 2026-03-08T22:45:15.269 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test3' already exists 2026-03-08T22:45:15.284 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:16.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test3 size 2 2026-03-08T22:45:17.034 INFO:tasks.workunit.client.0.vm06.stderr:set pool 3 size to 2 2026-03-08T22:45:17.052 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:17.052 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test4 1 1 2026-03-08T22:45:17.052 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test4 1 1 2026-03-08T22:45:17.296 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test4' already exists 2026-03-08T22:45:17.309 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:18.310 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test4 size 2 2026-03-08T22:45:18.999 INFO:tasks.workunit.client.0.vm06.stderr:set pool 4 size to 2 2026-03-08T22:45:19.017 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:19.017 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test5 1 1 2026-03-08T22:45:19.017 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test5 1 1 2026-03-08T22:45:19.470 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test5' already exists 2026-03-08T22:45:19.482 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:20.483 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test5 size 2 2026-03-08T22:45:20.807 INFO:tasks.workunit.client.0.vm06.stderr:set pool 5 size to 2 2026-03-08T22:45:20.824 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:20.824 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test6 1 1 2026-03-08T22:45:20.824 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test6 1 1 2026-03-08T22:45:21.067 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test6' already exists 2026-03-08T22:45:21.079 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:22.080 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test6 size 2 2026-03-08T22:45:22.385 INFO:tasks.workunit.client.0.vm06.stderr:set pool 6 size to 2 2026-03-08T22:45:22.401 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:22.401 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test7 1 1 2026-03-08T22:45:22.401 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test7 1 1 2026-03-08T22:45:22.647 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test7' already exists 2026-03-08T22:45:22.658 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:23.660 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test7 size 2 2026-03-08T22:45:24.002 INFO:tasks.workunit.client.0.vm06.stderr:set pool 7 size to 2 2026-03-08T22:45:24.020 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:24.020 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test8 1 1 2026-03-08T22:45:24.020 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test8 1 1 2026-03-08T22:45:24.283 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test8' already exists 2026-03-08T22:45:24.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:25.297 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test8 size 2 2026-03-08T22:45:25.606 INFO:tasks.workunit.client.0.vm06.stderr:set pool 8 size to 2 2026-03-08T22:45:25.624 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in 
$(seq 1 $pools) 2026-03-08T22:45:25.624 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test9 1 1 2026-03-08T22:45:25.624 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test9 1 1 2026-03-08T22:45:25.871 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test9' already exists 2026-03-08T22:45:25.883 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:26.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test9 size 2 2026-03-08T22:45:27.237 INFO:tasks.workunit.client.0.vm06.stderr:set pool 9 size to 2 2026-03-08T22:45:27.255 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:65: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:27.255 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:67: TEST_backfill_priority: create_pool test10 1 1 2026-03-08T22:45:27.255 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test10 1 1 2026-03-08T22:45:27.497 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test10' already exists 2026-03-08T22:45:27.510 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:28.511 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:68: TEST_backfill_priority: ceph osd pool set test10 size 2 2026-03-08T22:45:28.870 INFO:tasks.workunit.client.0.vm06.stderr:set pool 10 size to 2 2026-03-08T22:45:28.888 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:70: TEST_backfill_priority: sleep 5 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:72: TEST_backfill_priority: wait_for_clean 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:45:33.889 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:45:33.889 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:45:33.946 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:45:33.946 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:45:33.946 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:45:33.946 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:45:33.946 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:45:33.946 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:45:34.177 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:45:34.177 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:45:34.177 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:45:34.177 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:45:34.177 INFO:tasks.workunit.client.0.vm06.stderr:4' 2026-03-08T22:45:34.177 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:45:34.178 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.178 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:45:34.271 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836492 2026-03-08T22:45:34.271 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836492 2026-03-08T22:45:34.271 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492' 2026-03-08T22:45:34.271 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.271 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:45:34.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: seq=38654705675 2026-03-08T22:45:34.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705675 2026-03-08T22:45:34.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-38654705675' 2026-03-08T22:45:34.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.358 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:45:34.441 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574858 2026-03-08T22:45:34.441 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574858 2026-03-08T22:45:34.441 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-38654705675 2-55834574858' 2026-03-08T22:45:34.441 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.441 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:45:34.526 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411336 2026-03-08T22:45:34.526 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411336 2026-03-08T22:45:34.526 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-38654705675 2-55834574858 3-77309411336' 2026-03-08T22:45:34.526 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.527 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:45:34.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247815 2026-03-08T22:45:34.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247815 2026-03-08T22:45:34.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-38654705675 2-55834574858 3-77309411336 4-98784247815' 2026-03-08T22:45:34.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:34.618 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836492 2026-03-08T22:45:34.618 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:34.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:45:34.619 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836492 2026-03-08T22:45:34.619 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:34.620 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836492 2026-03-08T22:45:34.620 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836492 2026-03-08T22:45:34.620 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836492' 2026-03-08T22:45:34.620 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:45:34.836 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836492 2026-03-08T22:45:34.836 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:34.837 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705675 2026-03-08T22:45:34.837 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:34.838 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:45:34.838 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705675 2026-03-08T22:45:34.838 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:34.839 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705675 2026-03-08T22:45:34.839 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705675 2026-03-08T22:45:34.839 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705675' 2026-03-08T22:45:34.839 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:45:35.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705675 -lt 38654705675 2026-03-08T22:45:35.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:35.069 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574858 2026-03-08T22:45:35.070 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:35.070 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:45:35.071 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574858 2026-03-08T22:45:35.071 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:35.072 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574858 2026-03-08T22:45:35.072 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574858 2026-03-08T22:45:35.072 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574858' 2026-03-08T22:45:35.072 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:45:35.310 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574858 -lt 55834574858 2026-03-08T22:45:35.310 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:35.311 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411336 2026-03-08T22:45:35.311 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:35.312 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:45:35.312 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411336 2026-03-08T22:45:35.312 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:35.313 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411336 2026-03-08T22:45:35.313 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411336 2026-03-08T22:45:35.313 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411336' 2026-03-08T22:45:35.313 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:45:35.532 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411336 -lt 77309411336 2026-03-08T22:45:35.532 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:35.533 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247815 2026-03-08T22:45:35.533 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:35.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:45:35.534 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247815 2026-03-08T22:45:35.534 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:35.535 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247815 2026-03-08T22:45:35.535 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247815 2026-03-08T22:45:35.535 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247815' 2026-03-08T22:45:35.535 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:45:35.752 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247815 -lt 98784247815 2026-03-08T22:45:35.752 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:45:35.752 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:35.752 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:36.035 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 10 == 0 2026-03-08T22:45:36.035 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:45:36.035 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:45:36.035 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:45:36.035 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:45:36.035 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:45:36.035 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:45:36.035 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:45:36.238 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=10 2026-03-08T22:45:36.238 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:45:36.238 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:36.238 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:36.506 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 10 = 10 2026-03-08T22:45:36.506 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:45:36.506 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:45:36.506 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:74: TEST_backfill_priority: ceph pg dump pgs 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:10.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:29.009497+0000 0'0 75:20 [2,1] 2 [2,1] 2 0'0 2026-03-08T22:45:27.442801+0000 0'0 2026-03-08T22:45:27.442801+0000 0 0 periodic scrub scheduled @ 2026-03-10T03:13:37.138016+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:9.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.527309+0000 0'0 75:31 [1,2] 1 [1,2] 1 0'0 2026-03-08T22:45:25.812031+0000 0'0 2026-03-08T22:45:25.812031+0000 0 0 periodic scrub scheduled @ 2026-03-10T02:44:56.412908+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:8.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:26.196309+0000 0'0 75:36 [3,4] 3 [3,4] 3 0'0 2026-03-08T22:45:24.214712+0000 0'0 2026-03-08T22:45:24.214712+0000 0 0 periodic scrub scheduled @ 2026-03-10T08:20:18.645693+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:7.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:24.132028+0000 0'0 75:48 [1,4] 1 [1,4] 1 0'0 2026-03-08T22:45:22.591405+0000 0'0 2026-03-08T22:45:22.591405+0000 0 0 periodic scrub scheduled @ 2026-03-10T00:40:14.001384+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:6.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 
2026-03-08T22:45:22.501583+0000 0'0 75:53 [4,2] 4 [4,2] 4 0'0 2026-03-08T22:45:21.012306+0000 0'0 2026-03-08T22:45:21.012306+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:33:54.597064+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:5.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:21.292922+0000 0'0 75:55 [3,4] 3 [3,4] 3 0'0 2026-03-08T22:45:19.353834+0000 0'0 2026-03-08T22:45:19.353834+0000 0 0 periodic scrub scheduled @ 2026-03-10T10:01:13.820788+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:4.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:19.666350+0000 0'0 75:69 [4,0] 4 [4,0] 4 0'0 2026-03-08T22:45:17.236705+0000 0'0 2026-03-08T22:45:17.236705+0000 0 0 periodic scrub scheduled @ 2026-03-10T04:39:41.222484+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:3.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:17.040403+0000 0'0 75:79 [1,2] 1 [1,2] 1 0'0 2026-03-08T22:45:15.214887+0000 0'0 2026-03-08T22:45:15.214887+0000 0 0 periodic scrub scheduled @ 2026-03-10T00:38:06.820064+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:15.177925+0000 0'0 75:81 [3,1] 3 [3,1] 3 0'0 2026-03-08T22:45:13.615689+0000 0'0 2026-03-08T22:45:13.615689+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:10:57.547334+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:13.517699+0000 0'0 75:98 [1,0] 1 [1,0] 1 0'0 2026-03-08T22:45:11.963744+0000 0'0 2026-03-08T22:45:11.963744+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:18:57.017262+0000 0 0 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout: 2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
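[Editor's note] The wait_for_clean sequence traced above first flushes pg stats on every OSD (ceph tell osd.N flush_pg_stats, then waiting for ceph osd last-stat-seq N to catch up), then polls until the count of PGs whose state contains both "active" and "clean" but not "stale" equals pgmap.num_pgs; the delays array it builds ('0.1' ... '12.8' '15' '15' '15' '15' '4.5') is an exponential backoff summing to the 90-second budget passed to get_timeout_delays. A minimal sketch of the polling core, assuming a reachable test cluster and using the exact jq filter shown in the trace; the fixed sleep 1 stands in for the real backoff schedule:

    wait_for_clean_sketch() {
        local want cur
        # total PG count from the cluster status pgmap
        want=$(ceph --format json status | jq .pgmap.num_pgs)
        while true; do
            # count active+clean, non-stale PGs (filter verbatim from the trace)
            cur=$(ceph --format json pg dump pgs |
                  jq '.pg_stats | [.[] | .state |
                      select(contains("active") and contains("clean")) |
                      select(contains("stale") | not)] | length')
            test "$cur" = "$want" && return 0
            sleep 1
        done
    }

Here both checks returned 10 on the first pass, so wait_for_clean broke out immediately and the test proceeded to dump the ten active+clean PGs shown above.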
2026-03-08T22:45:36.701 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:78: TEST_backfill_priority: local PG1 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:79: TEST_backfill_priority: local POOLNUM1 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:80: TEST_backfill_priority: local pool1 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:81: TEST_backfill_priority: local chk_osd1_1 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:82: TEST_backfill_priority: local chk_osd1_2 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:84: TEST_backfill_priority: local PG2 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:85: TEST_backfill_priority: local POOLNUM2 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:86: TEST_backfill_priority: local pool2 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:87: TEST_backfill_priority: local chk_osd2 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:89: TEST_backfill_priority: local PG3 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:90: TEST_backfill_priority: local POOLNUM3 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:91: TEST_backfill_priority: local pool3 2026-03-08T22:45:36.712 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: seq 1 10 2026-03-08T22:45:36.713 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:36.713 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: ceph pg map 1.0 --format=json 2026-03-08T22:45:36.713 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: jq '.acting[]' 2026-03-08T22:45:36.930 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:45:36.931 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: 
TEST_backfill_priority: local test_osd1=1 2026-03-08T22:45:36.931 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: local test_osd2=0 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:98: TEST_backfill_priority: '[' -z '' ']' 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:100: TEST_backfill_priority: PG1=1.0 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:101: TEST_backfill_priority: POOLNUM1=1 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:102: TEST_backfill_priority: pool1=test1 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:103: TEST_backfill_priority: chk_osd1_1=1 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:104: TEST_backfill_priority: chk_osd1_2=0 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: ceph pg map 2.0 --format=json 2026-03-08T22:45:36.932 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: jq '.acting[]' 2026-03-08T22:45:37.153 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.154 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: local test_osd1=3 2026-03-08T22:45:37.154 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: local test_osd2=1 2026-03-08T22:45:37.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:98: TEST_backfill_priority: '[' -z 1.0 ']' 2026-03-08T22:45:37.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:105: TEST_backfill_priority: '[' -z '' -a 1 = 3 -a 0 '!=' 1 ']' 2026-03-08T22:45:37.155 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:111: TEST_backfill_priority: '[' -n '' -a 1 = 3 -a 0 '!=' 1 -a '' '!=' 1 ']' 2026-03-08T22:45:37.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:37.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: ceph pg map 3.0 --format=json 2026-03-08T22:45:37.155 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: jq '.acting[]' 2026-03-08T22:45:37.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.403 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: local test_osd1=1 2026-03-08T22:45:37.403 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: local test_osd2=2 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:98: TEST_backfill_priority: '[' -z 1.0 ']' 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:105: TEST_backfill_priority: '[' -z '' -a 1 = 1 -a 0 '!=' 2 ']' 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:107: TEST_backfill_priority: PG2=3.0 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:108: TEST_backfill_priority: POOLNUM2=3 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:109: TEST_backfill_priority: pool2=test3 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:110: TEST_backfill_priority: chk_osd2=2 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: jq '.acting[]' 2026-03-08T22:45:37.404 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: ceph pg map 4.0 --format=json 2026-03-08T22:45:37.655 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: 
TEST_backfill_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.656 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: local test_osd1=4 2026-03-08T22:45:37.656 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.657 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: local test_osd2=0 2026-03-08T22:45:37.657 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:98: TEST_backfill_priority: '[' -z 1.0 ']' 2026-03-08T22:45:37.657 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:105: TEST_backfill_priority: '[' -z 3.0 -a 1 = 4 -a 0 '!=' 0 ']' 2026-03-08T22:45:37.657 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:111: TEST_backfill_priority: '[' -n 3.0 -a 1 = 4 -a 0 '!=' 0 -a 2 '!=' 0 ']' 2026-03-08T22:45:37.657 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:37.657 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: ceph pg map 5.0 --format=json 2026-03-08T22:45:37.657 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: jq '.acting[]' 2026-03-08T22:45:37.878 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.879 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: local test_osd1=3 2026-03-08T22:45:37.879 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:45:37.879 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: local test_osd2=4 2026-03-08T22:45:37.880 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:98: TEST_backfill_priority: '[' -z 1.0 ']' 2026-03-08T22:45:37.880 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:105: TEST_backfill_priority: '[' -z 3.0 -a 1 = 3 -a 0 '!=' 4 ']' 2026-03-08T22:45:37.880 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:111: TEST_backfill_priority: '[' -n 3.0 -a 1 = 3 -a 0 '!=' 4 -a 2 '!=' 4 ']' 2026-03-08T22:45:37.880 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: for p in 
$(seq 1 $pools) 2026-03-08T22:45:37.880 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: ceph pg map 6.0 --format=json 2026-03-08T22:45:37.880 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: jq '.acting[]' 2026-03-08T22:45:38.108 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:45:38.108 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: local test_osd1=4 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: local test_osd2=2 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:98: TEST_backfill_priority: '[' -z 1.0 ']' 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:105: TEST_backfill_priority: '[' -z 3.0 -a 1 = 4 -a 0 '!=' 2 ']' 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:111: TEST_backfill_priority: '[' -n 3.0 -a 1 = 4 -a 0 '!=' 2 -a 2 '!=' 2 ']' 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:93: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: ceph pg map 7.0 --format=json 2026-03-08T22:45:38.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:95: TEST_backfill_priority: jq '.acting[]' 2026-03-08T22:45:38.387 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: head -1 td/osd-backfill-prio/acting 2026-03-08T22:45:38.387 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:96: TEST_backfill_priority: local test_osd1=1 2026-03-08T22:45:38.388 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: tail -1 td/osd-backfill-prio/acting 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:97: TEST_backfill_priority: local test_osd2=4 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:98: TEST_backfill_priority: '[' -z 1.0 ']' 2026-03-08T22:45:38.389 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:105: TEST_backfill_priority: '[' -z 3.0 -a 1 = 1 -a 0 '!=' 4 ']' 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:111: TEST_backfill_priority: '[' -n 3.0 -a 1 = 1 -a 0 '!=' 4 -a 2 '!=' 4 ']' 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:113: TEST_backfill_priority: PG3=7.0 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:114: TEST_backfill_priority: POOLNUM3=7 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:115: TEST_backfill_priority: pool3=test7 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:116: TEST_backfill_priority: break 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:119: TEST_backfill_priority: rm -f td/osd-backfill-prio/acting 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:121: TEST_backfill_priority: '[' test3 = '' -o pool3 = '' ']' 2026-03-08T22:45:38.389 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: seq 1 10 2026-03-08T22:45:38.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:38.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 1 '!=' 1 -a 1 '!=' 3 -a 1 '!=' 7 ']' 2026-03-08T22:45:38.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:38.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 2 '!=' 1 -a 2 '!=' 3 -a 2 '!=' 7 ']' 2026-03-08T22:45:38.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:131: TEST_backfill_priority: delete_pool test2 2026-03-08T22:45:38.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test2 2026-03-08T22:45:38.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test2 test2 --yes-i-really-really-mean-it 2026-03-08T22:45:38.611 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test2' does not exist 2026-03-08T22:45:38.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:38.623 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 3 '!=' 1 -a 3 '!=' 3 -a 3 '!=' 7 ']' 2026-03-08T22:45:38.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:38.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 4 '!=' 1 -a 4 '!=' 3 -a 4 '!=' 7 ']' 2026-03-08T22:45:38.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:131: TEST_backfill_priority: delete_pool test4 2026-03-08T22:45:38.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test4 2026-03-08T22:45:38.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test4 test4 --yes-i-really-really-mean-it 2026-03-08T22:45:38.918 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test4' does not exist 2026-03-08T22:45:38.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:38.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 5 '!=' 1 -a 5 '!=' 3 -a 5 '!=' 7 ']' 2026-03-08T22:45:38.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:131: TEST_backfill_priority: delete_pool test5 2026-03-08T22:45:38.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test5 2026-03-08T22:45:38.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test5 test5 --yes-i-really-really-mean-it 2026-03-08T22:45:39.228 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test5' does not exist 2026-03-08T22:45:39.239 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:39.239 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 6 '!=' 1 -a 6 '!=' 3 -a 6 '!=' 7 ']' 2026-03-08T22:45:39.239 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:131: TEST_backfill_priority: delete_pool test6 2026-03-08T22:45:39.239 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test6 2026-03-08T22:45:39.239 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test6 test6 --yes-i-really-really-mean-it 2026-03-08T22:45:39.500 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test6' does not exist 2026-03-08T22:45:39.512 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:39.512 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 7 '!=' 1 -a 7 '!=' 3 -a 7 '!=' 7 ']' 2026-03-08T22:45:39.512 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:39.512 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 8 '!=' 1 -a 8 '!=' 3 -a 8 '!=' 7 ']' 2026-03-08T22:45:39.512 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:131: TEST_backfill_priority: delete_pool test8 2026-03-08T22:45:39.512 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test8 2026-03-08T22:45:39.512 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test8 test8 --yes-i-really-really-mean-it 2026-03-08T22:45:39.771 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test8' does not exist 2026-03-08T22:45:39.783 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:39.783 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 9 '!=' 1 -a 9 '!=' 3 -a 9 '!=' 7 ']' 2026-03-08T22:45:39.783 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:131: TEST_backfill_priority: delete_pool test9 2026-03-08T22:45:39.783 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test9 2026-03-08T22:45:39.783 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test9 test9 --yes-i-really-really-mean-it 2026-03-08T22:45:40.231 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test9' does not exist 2026-03-08T22:45:40.242 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:127: TEST_backfill_priority: for p in $(seq 1 $pools) 2026-03-08T22:45:40.242 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:129: TEST_backfill_priority: '[' 10 '!=' 1 -a 10 '!=' 3 -a 10 '!=' 7 ']' 2026-03-08T22:45:40.242 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:131: TEST_backfill_priority: delete_pool test10 2026-03-08T22:45:40.242 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test10 2026-03-08T22:45:40.242 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd 
pool delete test10 test10 --yes-i-really-really-mean-it 2026-03-08T22:45:40.555 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test10' does not exist 2026-03-08T22:45:40.567 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:135: TEST_backfill_priority: ceph osd pool set test3 size 1 --yes-i-really-mean-it 2026-03-08T22:45:40.916 INFO:tasks.workunit.client.0.vm06.stderr:set pool 3 size to 1 2026-03-08T22:45:40.929 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:136: TEST_backfill_priority: ceph osd pool set test7 size 1 --yes-i-really-mean-it 2026-03-08T22:45:41.250 INFO:tasks.workunit.client.0.vm06.stderr:set pool 7 size to 1 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:137: TEST_backfill_priority: wait_for_clean 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:45:41.266 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:45:41.320 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:45:41.320 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:45:41.320 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:45:41.320 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:45:41.320 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:45:41.321 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr:4' 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.527 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:45:41.610 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836496 2026-03-08T22:45:41.610 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836496 2026-03-08T22:45:41.610 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496' 2026-03-08T22:45:41.610 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.610 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:45:41.691 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705678 2026-03-08T22:45:41.691 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705678 2026-03-08T22:45:41.691 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-38654705678' 2026-03-08T22:45:41.691 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.691 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:45:41.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574861 2026-03-08T22:45:41.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574861 2026-03-08T22:45:41.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-38654705678 2-55834574861' 2026-03-08T22:45:41.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: 
flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.772 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:45:41.860 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411340 2026-03-08T22:45:41.860 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411340 2026-03-08T22:45:41.860 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-38654705678 2-55834574861 3-77309411340' 2026-03-08T22:45:41.861 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.861 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:45:41.944 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247819 2026-03-08T22:45:41.944 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247819 2026-03-08T22:45:41.944 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-38654705678 2-55834574861 3-77309411340 4-98784247819' 2026-03-08T22:45:41.944 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:41.944 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836496 2026-03-08T22:45:41.944 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:41.945 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:45:41.946 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836496 2026-03-08T22:45:41.946 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:41.947 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836496 2026-03-08T22:45:41.947 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836496 2026-03-08T22:45:41.947 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836496' 2026-03-08T22:45:41.947 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:45:42.173 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836496 
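
The flush_pg_stats helper traced above (ceph-helpers.sh:2260-2279) runs in two passes: it first tells every OSD listed by `ceph osd ls` to flush its PG stats, recording the sequence number each OSD returns, and then polls `ceph osd last-stat-seq` until the monitor has caught up to each recorded sequence — exactly as in the trace, where `test 21474836493 -lt 21474836496` forced a one-second retry for osd.0. A minimal sketch reconstructed from the xtrace; the three ceph invocations are verbatim from the trace, while the loop scaffolding and the timeout decrement are inferred:

    # Sketch of flush_pg_stats, reconstructed from the xtrace above.
    # Assumes a running Ceph cluster reachable via the ceph CLI.
    flush_pg_stats() {
        local timeout=${1:-300}
        local ids seqs osd seq s
        ids=$(ceph osd ls)
        seqs=
        for osd in $ids; do
            # each tell returns the stat sequence the OSD is about to publish
            seq=$(ceph tell osd.$osd flush_pg_stats)
            test -z "$seq" && return 1
            seqs="$seqs $osd-$seq"
        done
        for s in $seqs; do
            osd=$(echo $s | cut -d - -f 1)
            seq=$(echo $s | cut -d - -f 2)
            echo "waiting osd.$osd seq $seq"
            # poll until the monitor has seen stats at least this new
            while test $(ceph osd last-stat-seq $osd) -lt $seq; do
                sleep 1
                timeout=$((timeout - 1))   # inferred; the trace only shows the -eq 0 check
                test $timeout -eq 0 && return 1
            done
        done
    }
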
2026-03-08T22:45:42.174 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:45:43.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:45:43.175 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:45:43.392 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836496 2026-03-08T22:45:43.392 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:43.392 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705678 2026-03-08T22:45:43.393 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:43.393 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:45:43.394 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705678 2026-03-08T22:45:43.394 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:43.395 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705678 2026-03-08T22:45:43.395 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705678 2026-03-08T22:45:43.395 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705678' 2026-03-08T22:45:43.395 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:45:43.614 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705679 -lt 38654705678 2026-03-08T22:45:43.614 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:43.615 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574861 2026-03-08T22:45:43.615 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:43.616 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:45:43.616 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574861 2026-03-08T22:45:43.616 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
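
The retry schedule consumed by wait_for_clean comes from `get_timeout_delays 90 .1`, whose result was printed at ceph-helpers.sh:1659: 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5. That is a doubling backoff seeded at 0.1s, capped at 15s per step, and truncated so the steps sum exactly to the 90s budget: 25.5s of doubling, four 15s holds reaching 85.5s, and a final 4.5s. The helper's body is not visible in the trace (it disables xtrace around itself), so the generator below is an inference that reproduces the printed schedule (modulo bc's leading-zero formatting), not the real implementation:

    # Inferred sketch of "get_timeout_delays 90 .1"; reproduces the delays
    # array printed in the trace. Uses bc for the fractional arithmetic.
    get_timeout_delays() {
        local budget=$1 step=${2:-1} max=${3:-15} total=0
        while [ "$(echo "$total < $budget" | bc)" -eq 1 ]; do
            if [ "$(echo "$total + $step > $budget" | bc)" -eq 1 ]; then
                step=$(echo "$budget - $total" | bc)   # final partial step (the 4.5)
            fi
            echo -n "$step "
            total=$(echo "$total + $step" | bc)
            # double until the cap, then hold at the cap (the 15 15 15 15 run)
            step=$(echo "if ($step * 2 > $max) $max else $step * 2" | bc)
        done
        echo
    }
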
2026-03-08T22:45:43.618 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574861 2026-03-08T22:45:43.618 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574861 2026-03-08T22:45:43.618 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574861' 2026-03-08T22:45:43.618 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:45:43.831 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574861 -lt 55834574861 2026-03-08T22:45:43.831 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:43.831 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411340 2026-03-08T22:45:43.831 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:43.832 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:45:43.832 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411340 2026-03-08T22:45:43.832 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:43.833 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411340 2026-03-08T22:45:43.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411340 2026-03-08T22:45:43.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411340' 2026-03-08T22:45:43.833 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:45:44.048 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411340 -lt 77309411340 2026-03-08T22:45:44.048 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:44.048 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247819 2026-03-08T22:45:44.048 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:44.049 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:45:44.049 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247819 2026-03-08T22:45:44.049 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:44.050 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247819 2026-03-08T22:45:44.050 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247819 2026-03-08T22:45:44.050 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247819' 2026-03-08T22:45:44.050 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:45:44.270 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247819 -lt 98784247819 2026-03-08T22:45:44.270 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:45:44.270 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:44.270 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:44.567 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 3 == 0 2026-03-08T22:45:44.567 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:45:44.567 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:45:44.567 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:45:44.567 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:45:44.567 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:45:44.568 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:45:44.568 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:45:44.789 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:45:44.789 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:45:44.789 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
get_num_pgs: ceph --format json status 2026-03-08T22:45:44.789 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:45.073 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 3 2026-03-08T22:45:45.073 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:45:45.073 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:45:45.073 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:139: TEST_backfill_priority: dd if=/dev/urandom of=td/osd-backfill-prio/data bs=1M count=10 2026-03-08T22:45:45.098 INFO:tasks.workunit.client.0.vm06.stderr:10+0 records in 2026-03-08T22:45:45.098 INFO:tasks.workunit.client.0.vm06.stderr:10+0 records out 2026-03-08T22:45:45.098 INFO:tasks.workunit.client.0.vm06.stderr:10485760 bytes (10 MB, 10 MiB) copied, 0.0240671 s, 436 MB/s 2026-03-08T22:45:45.098 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:140: TEST_backfill_priority: p=1 2026-03-08T22:45:45.098 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:141: TEST_backfill_priority: for pname in $pool1 $pool2 $pool3 2026-03-08T22:45:45.098 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: seq 1 50 2026-03-08T22:45:45.099 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.099 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj1-p1 td/osd-backfill-prio/data 2026-03-08T22:45:45.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj2-p1 td/osd-backfill-prio/data 2026-03-08T22:45:45.246 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.246 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj3-p1 td/osd-backfill-prio/data 2026-03-08T22:45:45.318 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.319 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj4-p1 
td/osd-backfill-prio/data 2026-03-08T22:45:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj5-p1 td/osd-backfill-prio/data 2026-03-08T22:45:45.468 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.468 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj6-p1 td/osd-backfill-prio/data 2026-03-08T22:45:45.543 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.543 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj7-p1 td/osd-backfill-prio/data 2026-03-08T22:45:45.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj8-p1 td/osd-backfill-prio/data 2026-03-08T22:45:45.694 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:45.694 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj9-p1 td/osd-backfill-prio/data 2026-03-08T22:45:46.105 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:46.105 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj10-p1 td/osd-backfill-prio/data 2026-03-08T22:45:47.553 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:47.553 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj11-p1 td/osd-backfill-prio/data 2026-03-08T22:45:47.765 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:47.765 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj12-p1 td/osd-backfill-prio/data 
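
Just before the data load above began, wait_for_clean (ceph-helpers.sh:1656-1687) declared the cluster clean when the count of PGs whose state contains both "active" and "clean" (and not "stale") matched pgmap.num_pgs: the trace shows cur_active_clean=3, then `test 3 = 3` and the break. The two jq filters below are copied verbatim from the trace; the polling wrapper around them is a simplified sketch:

    # The two counters compared by wait_for_clean; jq filters are copied
    # from the trace (ceph-helpers.sh:1368 and 1425).
    get_num_active_clean() {
        ceph --format json pg dump pgs |
            jq '.pg_stats | [.[] | .state |
                  select(contains("active") and contains("clean")) |
                  select(contains("stale") | not)] | length'
    }
    get_num_pgs() {
        ceph --format json status | jq .pgmap.num_pgs
    }
    # Simplified polling loop; the real helper first flushes stats and
    # bails if there are no PGs at all ("test 3 == 0" in the trace).
    wait_for_clean_sketch() {
        local -a delays=($(get_timeout_delays 90 .1))
        local -i loop=0
        flush_pg_stats || return 1
        while true; do
            [ "$(get_num_active_clean)" = "$(get_num_pgs)" ] && return 0
            sleep ${delays[$loop]} || return 1
            loop+=1
            [ $loop -ge ${#delays[*]} ] && return 1
        done
    }
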
2026-03-08T22:45:47.834 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:47.834 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj13-p1 td/osd-backfill-prio/data 2026-03-08T22:45:47.909 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:47.910 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj14-p1 td/osd-backfill-prio/data 2026-03-08T22:45:47.982 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:47.982 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj15-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.051 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.051 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj16-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.123 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.123 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj17-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.258 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.258 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj18-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.333 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.333 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj19-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.407 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.407 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj20-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.482 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.482 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj21-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.556 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.556 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj22-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.627 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.627 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj23-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.697 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.697 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj24-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.767 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.767 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj25-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.837 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.837 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj26-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.906 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.906 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj27-p1 td/osd-backfill-prio/data 2026-03-08T22:45:48.976 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:48.976 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj28-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.045 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj29-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.114 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.114 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj30-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.180 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.180 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj31-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.249 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.249 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj32-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.318 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.318 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj33-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.383 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.383 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj34-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj35-p1 td/osd-backfill-prio/data 2026-03-08T22:45:49.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:49.977 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj36-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.151 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.151 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj37-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.220 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.220 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj38-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.293 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.293 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj39-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.363 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.363 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj40-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.432 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.433 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj41-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj42-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.581 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.582 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj43-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.656 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.656 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj44-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.756 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.756 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj45-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.859 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.859 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj46-p1 td/osd-backfill-prio/data 2026-03-08T22:45:50.961 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:50.961 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj47-p1 td/osd-backfill-prio/data 2026-03-08T22:45:51.071 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.071 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj48-p1 td/osd-backfill-prio/data 2026-03-08T22:45:51.178 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.178 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj49-p1 td/osd-backfill-prio/data 2026-03-08T22:45:51.290 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.290 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test1 put obj50-p1 td/osd-backfill-prio/data 2026-03-08T22:45:51.388 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:147: TEST_backfill_priority: expr 1 + 1 2026-03-08T22:45:51.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:147: TEST_backfill_priority: p=2 2026-03-08T22:45:51.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:141: TEST_backfill_priority: for pname in $pool1 $pool2 $pool3 2026-03-08T22:45:51.389 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: seq 1 50 2026-03-08T22:45:51.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: 
TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj1-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj2-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.516 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.516 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj3-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.575 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj4-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.636 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.636 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj5-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj6-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.785 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.785 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj7-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:51.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj8-p2 td/osd-backfill-prio/data 2026-03-08T22:45:51.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 
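
The data-load phase now in progress writes the same 10 MiB payload into 50 objects per selected pool, with the object name encoding the iteration and the pool index (obj$i-p$p): test1 was filled above, test3 is being filled here, and test7 follows. Reconstructed from osd-backfill-prio.sh:139-147 as shown in the trace; objects=50 is inferred from the "seq 1 50" expansion:

    # Sketch of the data-load loop (osd-backfill-prio.sh:139-147 in the trace).
    # pool1/pool2/pool3 are the pools selected earlier: test1, test3, test7.
    objects=50
    dd if=/dev/urandom of=td/osd-backfill-prio/data bs=1M count=10
    p=1
    for pname in $pool1 $pool2 $pool3; do
        for i in $(seq 1 $objects); do
            rados -p $pname put obj$i-p$p td/osd-backfill-prio/data
        done
        p=$(expr $p + 1)   # the trace shows "expr 1 + 1" then p=2
    done
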
2026-03-08T22:45:51.928 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj9-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj10-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj11-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.192 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.192 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj12-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.278 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.279 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj13-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.376 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.376 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj14-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.456 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.456 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj15-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.541 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.541 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj16-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.615 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.615 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj17-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.686 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj18-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj19-p2 td/osd-backfill-prio/data 2026-03-08T22:45:52.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:52.971 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj20-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.065 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.065 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj21-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.135 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.135 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj22-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.219 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.220 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj23-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.426 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.426 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj24-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.513 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.513 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj25-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.605 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.605 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj26-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.687 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.687 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj27-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.764 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.764 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj28-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.832 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.832 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj29-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.905 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.905 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj30-p2 td/osd-backfill-prio/data 2026-03-08T22:45:53.983 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:53.983 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj31-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.064 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.065 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj32-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.136 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj33-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.197 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.197 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj34-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.259 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.259 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj35-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.322 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.322 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj36-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.437 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.438 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj37-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj38-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.557 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.557 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj39-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.611 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.612 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj40-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.672 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.672 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj41-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.733 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.733 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj42-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.792 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.792 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj43-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.852 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.852 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj44-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.907 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.907 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj45-p2 td/osd-backfill-prio/data 2026-03-08T22:45:54.960 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:54.960 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj46-p2 td/osd-backfill-prio/data 2026-03-08T22:45:55.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj47-p2 td/osd-backfill-prio/data 2026-03-08T22:45:55.075 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.075 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj48-p2 td/osd-backfill-prio/data 2026-03-08T22:45:55.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.136 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj49-p2 td/osd-backfill-prio/data 2026-03-08T22:45:55.194 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.194 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test3 put obj50-p2 td/osd-backfill-prio/data 2026-03-08T22:45:55.249 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:147: TEST_backfill_priority: expr 2 + 1 2026-03-08T22:45:55.250 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:147: TEST_backfill_priority: p=3 2026-03-08T22:45:55.250 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:141: TEST_backfill_priority: for pname in $pool1 $pool2 $pool3 2026-03-08T22:45:55.250 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: seq 1 50 2026-03-08T22:45:55.251 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.251 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj1-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.327 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.327 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj2-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.417 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.417 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj3-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.477 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.477 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj4-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.535 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.535 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: 
TEST_backfill_priority: rados -p test7 put obj5-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.590 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj6-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.641 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.641 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj7-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.696 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.696 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj8-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.760 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.760 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj9-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.813 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.813 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj10-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.869 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.870 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj11-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.934 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.934 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj12-p3 td/osd-backfill-prio/data 2026-03-08T22:45:55.989 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:55.989 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put 
obj13-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.058 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.058 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj14-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.107 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.107 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj15-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.161 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.161 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj16-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.218 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.218 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj17-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.277 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.277 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj18-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.334 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.335 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj19-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj20-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.447 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.447 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj21-p3 td/osd-backfill-prio/data 
2026-03-08T22:45:56.507 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.507 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj22-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.566 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.566 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj23-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj24-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj25-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.731 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.731 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj26-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.784 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.784 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj27-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.840 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.840 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj28-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.898 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.899 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj29-p3 td/osd-backfill-prio/data 2026-03-08T22:45:56.956 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:56.956 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj30-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.012 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.012 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj31-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj32-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.119 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.119 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj33-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.697 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.697 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj34-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.758 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.758 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj35-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.812 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.812 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj36-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.865 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.865 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj37-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.922 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.922 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj38-p3 td/osd-backfill-prio/data 2026-03-08T22:45:57.979 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:57.980 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj39-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.036 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.036 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj40-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.090 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.090 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj41-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.147 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.147 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj42-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.203 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.203 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj43-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.257 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.257 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj44-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.318 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.318 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj45-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.380 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.380 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj46-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.476 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.476 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj47-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.555 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.555 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj48-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.642 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.642 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj49-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.734 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:143: TEST_backfill_priority: for i in $(seq 1 $objects) 2026-03-08T22:45:58.734 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:145: TEST_backfill_priority: rados -p test7 put obj50-p3 td/osd-backfill-prio/data 2026-03-08T22:45:58.826 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:147: TEST_backfill_priority: expr 3 + 1 2026-03-08T22:45:58.827 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:147: TEST_backfill_priority: p=4 2026-03-08T22:45:58.827 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:150: TEST_backfill_priority: get_not_primary test1 obj1-p1 2026-03-08T22:45:58.827 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=test1 2026-03-08T22:45:58.827 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1-p1 2026-03-08T22:45:58.827 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary test1 obj1-p1 2026-03-08T22:45:58.827 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test1 2026-03-08T22:45:58.827 
INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1-p1
2026-03-08T22:45:58.828 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test1 obj1-p1
2026-03-08T22:45:58.828 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:45:59.059 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1
2026-03-08T22:45:59.059 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map test1 obj1-p1
2026-03-08T22:45:59.059 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]'
2026-03-08T22:45:59.294 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:150: TEST_backfill_priority: local otherosd=0
2026-03-08T22:45:59.294 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:152: TEST_backfill_priority: ceph pg dump pgs
2026-03-08T22:45:59.501 INFO:tasks.workunit.client.0.vm06.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T22:45:59.501 INFO:tasks.workunit.client.0.vm06.stdout:7.0 32 0 0 0 0 335544320 0 0 96 0 96 active+clean 2026-03-08T22:45:41.254369+0000 86'96 86:164 [1] 1 [1] 1 0'0 2026-03-08T22:45:22.591405+0000 0'0 2026-03-08T22:45:22.591405+0000 0 0 periodic scrub scheduled @ 2026-03-10T10:30:06.366300+0000 0 0
2026-03-08T22:45:59.501 INFO:tasks.workunit.client.0.vm06.stdout:3.0 50 0 0 0 0 524288000 0 0 50 100 50 active+clean 2026-03-08T22:45:41.254218+0000 86'150 86:247 [1] 1 [1] 1 0'0 2026-03-08T22:45:15.214887+0000 0'0 2026-03-08T22:45:15.214887+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:10:59.149892+0000 0 0
2026-03-08T22:45:59.501 INFO:tasks.workunit.client.0.vm06.stdout:1.0 50 0 0 0 0 524288000 0 0 50 100 50 active+clean 2026-03-08T22:45:13.517699+0000 86'150 86:264 [1,0] 1 [1,0] 1 0'0 2026-03-08T22:45:11.963744+0000 0'0 2026-03-08T22:45:11.963744+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:18:57.017262+0000 0 0
2026-03-08T22:45:59.501 INFO:tasks.workunit.client.0.vm06.stdout:
2026-03-08T22:45:59.501 INFO:tasks.workunit.client.0.vm06.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
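Note: the helper trace above shows how the test picks its OSDs for an object. get_primary maps the object with ceph osd map in JSON form and extracts acting_primary with jq; get_not_primary reruns the map and takes the first acting OSD that is not the primary. A condensed sketch of the two ceph-helpers.sh helpers as they appear in the xtrace (lines 1196-1234 there):

    # Condensed from the traced ceph-helpers.sh fragments above.
    get_primary() {
        local poolname=$1 objectname=$2
        ceph --format json osd map $poolname $objectname | jq .acting_primary
    }
    get_not_primary() {
        local poolname=$1 objectname=$2
        local primary=$(get_primary $poolname $objectname)
        ceph --format json osd map $poolname $objectname |
            jq ".acting | map(select (. != $primary)) | .[0]"
    }

In the pg dump above, 7.0 and 3.0 are active+clean on osd.1 alone while 1.0 sits on [1,0]; raising test7's size to 2 just below is what creates the backfill work whose priority the test then examines.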
2026-03-08T22:45:59.501 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs
2026-03-08T22:45:59.513 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:153: TEST_backfill_priority: ERRORS=0
2026-03-08T22:45:59.515 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:155: TEST_backfill_priority: ceph osd set nobackfill
2026-03-08T22:45:59.831 INFO:tasks.workunit.client.0.vm06.stderr:nobackfill is set
2026-03-08T22:45:59.845 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:156: TEST_backfill_priority: ceph osd set noout
2026-03-08T22:46:00.270 INFO:tasks.workunit.client.0.vm06.stderr:noout is set
2026-03-08T22:46:00.282 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:160: TEST_backfill_priority: ceph osd pool set test7 size 2
2026-03-08T22:46:00.601 INFO:tasks.workunit.client.0.vm06.stderr:set pool 7 size to 2
2026-03-08T22:46:00.618 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:161: TEST_backfill_priority: sleep 2
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:163: TEST_backfill_priority: get_asok_path osd.1
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:163: TEST_backfill_priority: CEPH_ARGS=
2026-03-08T22:46:02.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:163: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:{
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:    "local_reservations": {
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "max_allowed": 1,
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "min_priority": 0,
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "queues": [],
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "in_progress": [
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:                "item": "7.0",
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:                "prio": 151,
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:                "can_preempt": true
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:            }
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        ]
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:    },
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:    "remote_reservations": {
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "max_allowed": 1,
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "min_priority": 0,
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "queues": [],
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:        "in_progress": []
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:    }
2026-03-08T22:46:02.676 INFO:tasks.workunit.client.0.vm06.stdout:}
2026-03-08T22:46:02.684 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:166: TEST_backfill_priority: seq 1 10
2026-03-08T22:46:02.684 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:166: TEST_backfill_priority: for i in $(seq 1 $max_tries)
2026-03-08T22:46:02.684 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:168: TEST_backfill_priority: ceph pg force-backfill 7.0
2026-03-08T22:46:02.684 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:168: TEST_backfill_priority: grep -q 'doesn'\''t require backfilling'
2026-03-08T22:46:02.905 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:169: TEST_backfill_priority: break
2026-03-08T22:46:02.905 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:177: TEST_backfill_priority: flush_pg_stats
2026-03-08T22:46:02.905 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:46:02.905 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:46:03.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:46:03.144 INFO:tasks.workunit.client.0.vm06.stderr:1
2026-03-08T22:46:03.144 INFO:tasks.workunit.client.0.vm06.stderr:2
2026-03-08T22:46:03.144 INFO:tasks.workunit.client.0.vm06.stderr:3
2026-03-08T22:46:03.144 INFO:tasks.workunit.client.0.vm06.stderr:4'
2026-03-08T22:46:03.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:46:03.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:46:03.144 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
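Note: the flush_pg_stats call that starts in the records above and runs through the following lines makes every OSD's PG stats visible to the monitor before the test inspects them: each OSD is told to flush and returns a sequence number, and the helper then polls ceph osd last-stat-seq until the monitor has caught up to that sequence. A sketch reconstructed from the traced ceph-helpers.sh:2260-2279; the timeout bookkeeping is only partly visible in the trace and is hedged in the comments:

    # Reconstructed from the flush_pg_stats trace; not the verbatim helper.
    flush_pg_stats() {
        local timeout=${1:-300}
        local ids=$(ceph osd ls)
        local seqs='' osd seq s
        for osd in $ids ; do
            seq=$(ceph tell osd.$osd flush_pg_stats)
            test -z "$seq" && continue   # exact empty-seq handling not visible
            seqs="$seqs ${osd}-${seq}"
        done
        for s in $seqs ; do
            osd=$(echo $s | cut -d - -f 1)
            seq=$(echo $s | cut -d - -f 2)
            echo "waiting osd.$osd seq $seq"
            while test $(ceph osd last-stat-seq $osd) -lt $seq ; do
                sleep 1
                # the real helper also counts sleeps against $timeout; only
                # the '[ 300 -eq 0 ]' guard is visible in this trace
                [ "$timeout" -eq 0 ] && return 1
            done
        done
    }

In this run osd.0 lagged by one poll (21474836500 < 21474836502 on the first check); the other four OSDs were already current.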
2026-03-08T22:46:03.311 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836502 2026-03-08T22:46:03.311 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836502 2026-03-08T22:46:03.311 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502' 2026-03-08T22:46:03.311 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:03.312 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:03.393 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705685 2026-03-08T22:46:03.401 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705685 2026-03-08T22:46:03.401 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502 1-38654705685' 2026-03-08T22:46:03.401 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:03.402 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:03.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574867 2026-03-08T22:46:03.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574867 2026-03-08T22:46:03.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502 1-38654705685 2-55834574867' 2026-03-08T22:46:03.479 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:03.479 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:03.564 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411346 2026-03-08T22:46:03.564 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411346 2026-03-08T22:46:03.564 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502 1-38654705685 2-55834574867 3-77309411346' 2026-03-08T22:46:03.564 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:03.564 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:46:03.650 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247825 2026-03-08T22:46:03.650 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247825 2026-03-08T22:46:03.650 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502 1-38654705685 2-55834574867 3-77309411346 4-98784247825' 2026-03-08T22:46:03.650 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:03.650 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836502 2026-03-08T22:46:03.650 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:03.651 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:03.651 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836502 2026-03-08T22:46:03.651 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:03.652 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836502 2026-03-08T22:46:03.652 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836502 2026-03-08T22:46:03.653 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836502' 2026-03-08T22:46:03.653 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:03.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836502 2026-03-08T22:46:03.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:04.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:04.868 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:05.208 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836502 -lt 21474836502 2026-03-08T22:46:05.247 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:05.247 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705685 2026-03-08T22:46:05.247 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:05.247 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:46:05.247 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705685 2026-03-08T22:46:05.247 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:05.248 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705685 2026-03-08T22:46:05.248 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705685 2026-03-08T22:46:05.248 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705685' 2026-03-08T22:46:05.248 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:05.432 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705685 -lt 38654705685 2026-03-08T22:46:05.432 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:05.432 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574867 2026-03-08T22:46:05.432 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:05.433 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:46:05.434 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574867 2026-03-08T22:46:05.434 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:05.435 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574867 2026-03-08T22:46:05.435 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574867 2026-03-08T22:46:05.435 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574867' 2026-03-08T22:46:05.435 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:46:05.718 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574868 -lt 55834574867 2026-03-08T22:46:05.718 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:05.719 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411346 2026-03-08T22:46:05.719 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:05.719 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:46:05.720 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411346 2026-03-08T22:46:05.720 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:05.721 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411346 2026-03-08T22:46:05.721 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411346 2026-03-08T22:46:05.721 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411346' 2026-03-08T22:46:05.721 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:46:05.948 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411346 -lt 77309411346 2026-03-08T22:46:05.948 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:05.948 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247825 2026-03-08T22:46:05.948 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:05.950 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:46:05.950 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247825 2026-03-08T22:46:05.950 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:05.951 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247825 2026-03-08T22:46:05.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247825 2026-03-08T22:46:05.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247825' 2026-03-08T22:46:05.951 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247825 -lt 98784247825 2026-03-08T22:46:06.200 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:178: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:178: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:46:06.200 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:178: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:06.258 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:06.258 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:06.259 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:06.267 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:180: TEST_backfill_priority: ceph osd out osd.0 2026-03-08T22:46:06.536 INFO:tasks.workunit.client.0.vm06.stderr:osd.0 is already out. 
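
The ceph-helpers.sh flush_pg_stats trace that dominates this part of the run is the same two-pass loop every time: first `ceph tell osd.N flush_pg_stats` against every id returned by `ceph osd ls`, collecting the stat sequence number each OSD reports, then a one-second poll of `ceph osd last-stat-seq N` until the monitor has absorbed each recorded seq (or the 300-iteration timeout runs out). A minimal sketch of that logic, reconstructed from the xtrace above rather than copied from the helper, so details such as the exact timeout accounting may differ:

    flush_pg_stats() {
        local timeout=300                     # poll iterations, 1s apart (trace line 2260)
        local ids seqs osd seq s
        ids=$(ceph osd ls)
        seqs=''
        for osd in $ids; do
            # Ask the OSD to publish its PG stats; it replies with a seq number.
            seq=$(ceph tell osd.$osd flush_pg_stats)
            test -z "$seq" && continue
            seqs="$seqs $osd-$seq"
        done
        for s in $seqs; do
            osd=$(echo $s | cut -d - -f 1)
            seq=$(echo $s | cut -d - -f 2)
            echo "waiting osd.$osd seq $seq"
            # Spin until the mon's view of this OSD catches up to that seq.
            while test $(ceph osd last-stat-seq $osd) -lt $seq; do
                sleep 1
                [ $timeout -eq 0 ] && return 1
                timeout=$((timeout - 1))
            done
        done
    }

Comparisons in the trace like `test 55834574868 -lt 55834574867` are this wait loop finding the mon already one seq ahead, so the loop body never runs.
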
2026-03-08T22:46:06.549 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:181: TEST_backfill_priority: sleep 2 2026-03-08T22:46:08.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:182: TEST_backfill_priority: flush_pg_stats 2026-03-08T22:46:08.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:46:08.550 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:46:08.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:46:08.773 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:46:08.773 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:46:08.773 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:46:08.773 INFO:tasks.workunit.client.0.vm06.stderr:4' 2026-03-08T22:46:08.773 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:46:08.773 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:08.773 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:46:08.859 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836505 2026-03-08T22:46:08.859 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836505 2026-03-08T22:46:08.859 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505' 2026-03-08T22:46:08.859 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:08.859 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:08.942 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705688 2026-03-08T22:46:08.942 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705688 2026-03-08T22:46:08.942 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-38654705688' 2026-03-08T22:46:08.942 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:08.942 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:09.027 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574871 2026-03-08T22:46:09.027 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574871 2026-03-08T22:46:09.027 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-38654705688 2-55834574871' 2026-03-08T22:46:09.027 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:09.027 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:09.122 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411349 2026-03-08T22:46:09.122 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411349 2026-03-08T22:46:09.122 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-38654705688 2-55834574871 3-77309411349' 2026-03-08T22:46:09.122 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:09.122 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:46:09.213 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247828 2026-03-08T22:46:09.213 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247828 2026-03-08T22:46:09.213 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-38654705688 2-55834574871 3-77309411349 4-98784247828' 2026-03-08T22:46:09.213 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:09.213 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836505 2026-03-08T22:46:09.213 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:09.214 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:09.215 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836505 2026-03-08T22:46:09.215 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:09.216 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=21474836505 2026-03-08T22:46:09.216 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836505 2026-03-08T22:46:09.216 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836505' 2026-03-08T22:46:09.216 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:09.433 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 0 -lt 21474836505 2026-03-08T22:46:09.433 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:10.434 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:10.434 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:10.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 0 -lt 21474836505 2026-03-08T22:46:10.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:11.677 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:46:11.677 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:12.064 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836505 2026-03-08T22:46:12.064 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:12.065 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705688 2026-03-08T22:46:12.065 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:12.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:46:12.066 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705688 2026-03-08T22:46:12.066 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:12.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705688 2026-03-08T22:46:12.067 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705688 2026-03-08T22:46:12.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705688' 2026-03-08T22:46:12.067 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:12.292 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705688 -lt 38654705688 2026-03-08T22:46:12.292 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:12.292 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574871 2026-03-08T22:46:12.292 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:12.293 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:46:12.294 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574871 2026-03-08T22:46:12.294 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:12.295 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574871 2026-03-08T22:46:12.295 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574871' 2026-03-08T22:46:12.295 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574871 2026-03-08T22:46:12.295 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:46:12.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574871 -lt 55834574871 2026-03-08T22:46:12.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:12.540 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411349 2026-03-08T22:46:12.540 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:12.541 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:46:12.541 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411349 2026-03-08T22:46:12.541 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:12.542 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411349 2026-03-08T22:46:12.542 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411349 2026-03-08T22:46:12.542 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411349' 2026-03-08T22:46:12.542 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:46:12.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411350 -lt 77309411349 2026-03-08T22:46:12.774 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:12.774 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247828 2026-03-08T22:46:12.774 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:12.775 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:46:12.775 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247828 2026-03-08T22:46:12.775 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:12.777 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247828 2026-03-08T22:46:12.777 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247828 2026-03-08T22:46:12.777 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247828' 2026-03-08T22:46:12.777 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:46:12.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247829 -lt 98784247828 2026-03-08T22:46:12.998 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:183: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:12.998 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:12.998 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:12.999 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:12.999 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:12.999 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:12.999 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:13.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:183: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:46:13.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:183: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:13.059 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:13.060 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:13.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:184: TEST_backfill_priority: ceph pg dump pgs 2026-03-08T22:46:13.275 INFO:tasks.workunit.client.0.vm06.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION 
SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:46:13.275 INFO:tasks.workunit.client.0.vm06.stdout:7.0 50 0 50 0 0 524288000 0 0 50 100 50 active+undersized+degraded+remapped+backfilling+forced_backfill 2026-03-08T22:46:02.842783+0000 86'150 97:240 [1,4] 1 [1] 1 0'0 2026-03-08T22:45:22.591405+0000 0'0 2026-03-08T22:45:22.591405+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:46:13.275 INFO:tasks.workunit.client.0.vm06.stdout:3.0 50 0 0 0 0 524288000 0 0 50 100 50 active+clean 2026-03-08T22:45:41.254218+0000 86'150 97:265 [1] 1 [1] 1 0'0 2026-03-08T22:45:15.214887+0000 0'0 2026-03-08T22:45:15.214887+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:10:59.149892+0000 0 0 2026-03-08T22:46:13.275 INFO:tasks.workunit.client.0.vm06.stdout:1.0 50 0 0 50 0 524288000 0 0 50 100 50 active+remapped+backfill_wait 2026-03-08T22:46:08.459693+0000 86'150 97:285 [1,2] 1 [1,0] 1 0'0 2026-03-08T22:45:11.963744+0000 0'0 2026-03-08T22:45:11.963744+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:46:13.275 INFO:tasks.workunit.client.0.vm06.stdout: 2026-03-08T22:46:13.275 INFO:tasks.workunit.client.0.vm06.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:46:13.275 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs 2026-03-08T22:46:13.288 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:186: TEST_backfill_priority: ceph osd pool set test3 size 2 2026-03-08T22:46:13.602 INFO:tasks.workunit.client.0.vm06.stderr:set pool 3 size to 2 2026-03-08T22:46:13.616 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:187: TEST_backfill_priority: sleep 2 2026-03-08T22:46:15.618 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:188: TEST_backfill_priority: flush_pg_stats 2026-03-08T22:46:15.618 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:46:15.618 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr:4' 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:15.857 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:46:15.948 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836508 
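
The step traced here (osd-backfill-prio.sh:186-189) is the degraded-backfill case of the test: `ceph osd pool set test3 size 2` grows test3 from one replica to two, leaving pg 3.0 undersized+degraded with a backfill pending toward osd.2 (UP goes from [1] to [1,2] while ACTING stays [1]), and the flush_pg_stats pass that follows only ensures the next reservation dump reflects the new state. That dump, a few lines below, shows the intended ordering: pg 3.0 queued at the degraded-backfill priority (151 in this run) above pg 1.0's plain remapped backfill (prio 110), while forced pg 7.0 (prio 254) keeps the single in-progress slot, max_allowed being 1. A sketch of reading that queued priority back out of the dump, reusing the asok path and out-file names from the trace:

    # Capture osd.1's reservation dump and read back pg 3.0's queued priority.
    CEPH_ARGS= ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok \
        dump_recovery_reservations > td/osd-backfill-prio/out
    jq '(.local_reservations.queues[].items[] | select(.item == "3.0")).prio' \
        td/osd-backfill-prio/out    # expect 151 (degraded), not 110 (normal)
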
2026-03-08T22:46:15.948 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836508 2026-03-08T22:46:15.949 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508' 2026-03-08T22:46:15.949 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:15.949 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:16.033 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705691 2026-03-08T22:46:16.033 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705691 2026-03-08T22:46:16.033 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-38654705691' 2026-03-08T22:46:16.033 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:16.034 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:16.129 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574874 2026-03-08T22:46:16.129 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574874 2026-03-08T22:46:16.129 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-38654705691 2-55834574874' 2026-03-08T22:46:16.129 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:16.129 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:16.224 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411353 2026-03-08T22:46:16.224 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411353 2026-03-08T22:46:16.224 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-38654705691 2-55834574874 3-77309411353' 2026-03-08T22:46:16.224 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:16.225 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:46:16.313 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247831 2026-03-08T22:46:16.313 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247831 2026-03-08T22:46:16.313 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-38654705691 2-55834574874 3-77309411353 4-98784247831' 2026-03-08T22:46:16.313 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:16.313 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836508 2026-03-08T22:46:16.313 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:16.314 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:16.315 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836508 2026-03-08T22:46:16.315 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:16.316 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836508 2026-03-08T22:46:16.316 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836508 2026-03-08T22:46:16.316 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836508' 2026-03-08T22:46:16.316 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:16.558 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836506 -lt 21474836508 2026-03-08T22:46:16.558 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:17.559 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:17.559 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:17.789 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836509 -lt 21474836508 2026-03-08T22:46:17.789 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:17.789 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705691 2026-03-08T22:46:17.790 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:17.791 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:46:17.791 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705691 2026-03-08T22:46:17.791 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:17.792 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705691 2026-03-08T22:46:17.793 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705691 2026-03-08T22:46:17.793 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705691' 2026-03-08T22:46:17.793 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:18.033 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705691 -lt 38654705691 2026-03-08T22:46:18.033 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:18.033 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574874 2026-03-08T22:46:18.033 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:18.034 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:46:18.034 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574874 2026-03-08T22:46:18.035 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:18.036 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574874 2026-03-08T22:46:18.036 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574874 2026-03-08T22:46:18.036 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574874' 2026-03-08T22:46:18.036 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:46:18.357 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574874 -lt 55834574874 2026-03-08T22:46:18.357 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:18.357 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411353 2026-03-08T22:46:18.357 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:18.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:46:18.359 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411353 2026-03-08T22:46:18.359 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:18.359 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411353 2026-03-08T22:46:18.359 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411353 2026-03-08T22:46:18.360 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411353' 2026-03-08T22:46:18.360 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:46:18.597 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411353 -lt 77309411353 2026-03-08T22:46:18.597 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:18.597 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247831 2026-03-08T22:46:18.597 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:18.599 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:46:18.599 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247831 2026-03-08T22:46:18.599 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:18.600 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247831 2026-03-08T22:46:18.600 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247831 2026-03-08T22:46:18.600 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247831' 2026-03-08T22:46:18.601 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:46:18.832 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247832 -lt 98784247831 2026-03-08T22:46:18.832 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:189: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:18.832 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:18.832 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:18.832 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:18.832 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:18.832 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:18.833 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:18.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:189: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:46:18.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:189: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:190: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151, 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:18.902 
INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:18.902 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:18.903 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:191: TEST_backfill_priority: ceph pg dump pgs 2026-03-08T22:46:19.116 INFO:tasks.workunit.client.0.vm06.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:46:19.116 INFO:tasks.workunit.client.0.vm06.stdout:7.0 50 0 50 0 0 524288000 0 0 50 100 50 active+undersized+degraded+remapped+backfilling+forced_backfill 2026-03-08T22:46:02.842783+0000 86'150 101:248 [1,4] 1 [1] 1 0'0 2026-03-08T22:45:22.591405+0000 0'0 2026-03-08T22:45:22.591405+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:46:19.116 INFO:tasks.workunit.client.0.vm06.stdout:3.0 50 0 50 0 0 524288000 0 0 50 100 50 active+undersized+degraded+remapped+backfill_wait 2026-03-08T22:46:15.611079+0000 86'150 101:274 [1,2] 1 [1] 1 0'0 2026-03-08T22:45:15.214887+0000 0'0 2026-03-08T22:45:15.214887+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:46:19.116 INFO:tasks.workunit.client.0.vm06.stdout:1.0 50 0 0 50 0 524288000 0 0 50 100 50 active+remapped+backfill_wait 2026-03-08T22:46:08.459693+0000 86'150 101:293 [1,2] 1 [1,0] 1 0'0 2026-03-08T22:45:11.963744+0000 0'0 2026-03-08T22:45:11.963744+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:46:19.116 INFO:tasks.workunit.client.0.vm06.stdout: 2026-03-08T22:46:19.116 INFO:tasks.workunit.client.0.vm06.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
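
Every admin-daemon query in this trace resolves the socket the same way, via the ceph-helpers.sh pair traced at lines 108-118: get_asok_dir returns $CEPH_ASOK_DIR when it is set and otherwise a per-process /tmp/ceph-asok.$$ directory (19257 here being the test script's pid), and get_asok_path appends ceph-<name>.asok to it. CEPH_ARGS is cleared first because an --admin-daemon call talks straight to the daemon's socket and needs none of the test's client arguments. Approximately, as reconstructed from the trace (only the branches the trace exercises are shown; the TMPDIR fallback is an assumption, the trace only shows /tmp):

    get_asok_dir() {
        if [ -n "$CEPH_ASOK_DIR" ]; then
            echo "$CEPH_ASOK_DIR"
        else
            echo ${TMPDIR:-/tmp}/ceph-asok.$$   # e.g. /tmp/ceph-asok.19257
        fi
    }

    get_asok_path() {
        local name=$1                           # e.g. osd.1
        if [ -n "$name" ]; then
            echo $(get_asok_dir)/ceph-$name.asok
        fi
    }

The dump itself always has the same shape: local_reservations (roughly, slots this OSD takes for PGs it is primary for) and remote_reservations (slots it grants to other primaries backfilling onto it), each holding pending "queues" bucketed by priority plus an "in_progress" list capped at max_allowed.
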
2026-03-08T22:46:19.116 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs 2026-03-08T22:46:19.129 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:193: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:19.129 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:193: TEST_backfill_priority: jq '(.local_reservations.queues[].items[] | select(.item == "1.0")).prio' 2026-03-08T22:46:19.139 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:193: TEST_backfill_priority: PRIO=110 2026-03-08T22:46:19.139 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:194: TEST_backfill_priority: '[' 110 '!=' 110 ']' 2026-03-08T22:46:19.139 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:201: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:19.140 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:201: TEST_backfill_priority: jq '.local_reservations.in_progress[0].item' 2026-03-08T22:46:19.150 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:201: TEST_backfill_priority: eval 'ITEM="7.0"' 2026-03-08T22:46:19.150 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:201: TEST_backfill_priority: ITEM=7.0 2026-03-08T22:46:19.150 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:202: TEST_backfill_priority: '[' 7.0 '!=' 7.0 ']' 2026-03-08T22:46:19.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:207: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:19.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:207: TEST_backfill_priority: jq '.local_reservations.in_progress[0].prio' 2026-03-08T22:46:19.161 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:207: TEST_backfill_priority: PRIO=254 2026-03-08T22:46:19.161 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:208: TEST_backfill_priority: '[' 254 '!=' 254 ']' 2026-03-08T22:46:19.161 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:216: TEST_backfill_priority: seq 1 10 2026-03-08T22:46:19.162 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:216: TEST_backfill_priority: for i in $(seq 1 $max_tries) 2026-03-08T22:46:19.162 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:218: TEST_backfill_priority: ceph pg force-backfill 3.0 2026-03-08T22:46:19.162 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:218: TEST_backfill_priority: 
grep -q 'doesn'\''t require backfilling' 2026-03-08T22:46:19.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:219: TEST_backfill_priority: break 2026-03-08T22:46:19.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:227: TEST_backfill_priority: sleep 2 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:228: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:21.387 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:228: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:46:21.388 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:228: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:21.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:229: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: { 
2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 254, 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:21.454 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:230: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:21.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:230: TEST_backfill_priority: jq '(.local_reservations.queues[].items[] | select(.item == "3.0")).prio' 2026-03-08T22:46:21.465 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:230: TEST_backfill_priority: PRIO=254 2026-03-08T22:46:21.465 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:231: TEST_backfill_priority: '[' 254 '!=' 254 ']' 2026-03-08T22:46:21.465 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:236: TEST_backfill_priority: flush_pg_stats 2026-03-08T22:46:21.465 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:46:21.465 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:46:21.704 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:46:21.704 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:46:21.704 
INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:46:21.704 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:46:21.704 INFO:tasks.workunit.client.0.vm06.stderr:4' 2026-03-08T22:46:21.704 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:46:21.704 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:21.704 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:46:21.792 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836512 2026-03-08T22:46:21.792 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836512 2026-03-08T22:46:21.792 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512' 2026-03-08T22:46:21.792 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:21.792 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:21.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705694 2026-03-08T22:46:21.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705694 2026-03-08T22:46:21.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-38654705694' 2026-03-08T22:46:21.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:21.882 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:21.974 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574877 2026-03-08T22:46:21.974 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574877 2026-03-08T22:46:21.974 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-38654705694 2-55834574877' 2026-03-08T22:46:21.975 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:21.975 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:22.061 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411356 
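The exchange traced above is the core verification pattern of TEST_backfill_priority: force-backfill is issued in a retry loop (osd-backfill-prio.sh:216-227) until `ceph pg force-backfill` stops answering "doesn't require backfilling", and the resulting reservation state is read back through the OSD's admin socket. A minimal sketch of that read-back, using the socket path and jq filters seen in this run (254 is Ceph's forced-backfill priority; the queued values 110 and 151 presumably derive from the backfill priority bases plus each pool's recovery_priority setting):

    # Dump osd.1's recovery reservations and extract per-PG priorities,
    # exactly as the traced script does (asok path is from this run).
    asok=/tmp/ceph-asok.19257/ceph-osd.1.asok
    dump=$(CEPH_ARGS= ceph --admin-daemon "$asok" dump_recovery_reservations)
    # Priority of a PG still waiting in the local queue (e.g. 1.0 -> 110):
    echo "$dump" | jq '(.local_reservations.queues[].items[] | select(.item == "1.0")).prio'
    # The single PG holding the reservation slot (max_allowed is 1 here):
    echo "$dump" | jq -r '.local_reservations.in_progress[0].item'

Because max_allowed is 1, only one backfill reservation runs at a time on this OSD: forcing 3.0 queues it at priority 254 behind the already in-progress 7.0, and, as the later dumps in this trace show, cancel-force-backfill merely drops a PG back to its computed priority (151) without interrupting a reservation that is already in progress.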
2026-03-08T22:46:22.061 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411356 2026-03-08T22:46:22.061 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-38654705694 2-55834574877 3-77309411356' 2026-03-08T22:46:22.061 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:22.061 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:46:22.148 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247835 2026-03-08T22:46:22.148 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247835 2026-03-08T22:46:22.148 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-38654705694 2-55834574877 3-77309411356 4-98784247835' 2026-03-08T22:46:22.148 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:22.148 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836512 2026-03-08T22:46:22.148 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:22.149 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:22.150 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836512 2026-03-08T22:46:22.150 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:22.150 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836512 2026-03-08T22:46:22.151 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836512 2026-03-08T22:46:22.151 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836512' 2026-03-08T22:46:22.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:22.388 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836509 -lt 21474836512 2026-03-08T22:46:22.388 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:23.390 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:23.390 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:23.620 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836512 2026-03-08T22:46:23.620 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:23.620 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705694 2026-03-08T22:46:23.620 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:23.621 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:46:23.622 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705694 2026-03-08T22:46:23.622 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:23.623 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705694 2026-03-08T22:46:23.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705694 2026-03-08T22:46:23.623 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705694' 2026-03-08T22:46:23.623 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:23.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705695 -lt 38654705694 2026-03-08T22:46:23.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:23.856 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574877 2026-03-08T22:46:23.856 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:23.857 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:46:23.857 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574877 2026-03-08T22:46:23.857 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:23.858 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574877 2026-03-08T22:46:23.858 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574877 2026-03-08T22:46:23.858 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574877' 2026-03-08T22:46:23.858 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:46:24.084 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574877 -lt 55834574877 2026-03-08T22:46:24.084 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:24.085 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411356 2026-03-08T22:46:24.085 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:24.086 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:46:24.086 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411356 2026-03-08T22:46:24.086 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:24.087 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411356 2026-03-08T22:46:24.087 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411356 2026-03-08T22:46:24.087 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411356' 2026-03-08T22:46:24.087 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:46:24.310 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411356 -lt 77309411356 2026-03-08T22:46:24.310 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:24.310 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247835 2026-03-08T22:46:24.310 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:24.311 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:46:24.311 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247835 2026-03-08T22:46:24.312 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:24.312 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247835 2026-03-08T22:46:24.313 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247835 2026-03-08T22:46:24.313 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247835' 2026-03-08T22:46:24.313 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:46:24.643 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247835 -lt 98784247835 2026-03-08T22:46:24.643 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:239: TEST_backfill_priority: ceph pg cancel-force-backfill 7.0 2026-03-08T22:46:24.858 INFO:tasks.workunit.client.0.vm06.stderr:instructing pg(s) [7.0] on osd.1 to cancel-force-backfill; 2026-03-08T22:46:24.858 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:46:24.871 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:240: TEST_backfill_priority: sleep 2 2026-03-08T22:46:26.873 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:241: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:26.873 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:26.873 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:26.873 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:26.873 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:26.873 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:26.873 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:26.874 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:241: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:46:26.874 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:241: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:26.941 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:242: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:26.942 
INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:26.942 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:243: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:26.943 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:243: TEST_backfill_priority: jq '(.local_reservations.queues[].items[] | select(.item == "7.0")).prio' 2026-03-08T22:46:26.952 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:243: TEST_backfill_priority: PRIO=151 2026-03-08T22:46:26.952 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:244: TEST_backfill_priority: '[' 151 '!=' 151 ']' 2026-03-08T22:46:26.953 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:250: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:26.953 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:250: TEST_backfill_priority: jq '.local_reservations.in_progress[0].item' 2026-03-08T22:46:26.963 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:250: TEST_backfill_priority: eval 'ITEM="3.0"' 2026-03-08T22:46:26.963 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:250: TEST_backfill_priority: ITEM=3.0 2026-03-08T22:46:26.963 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:251: TEST_backfill_priority: '[' 3.0 '!=' 3.0 ']' 2026-03-08T22:46:26.963 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:256: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:26.963 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:256: TEST_backfill_priority: jq '.local_reservations.in_progress[0].prio' 2026-03-08T22:46:26.972 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:256: TEST_backfill_priority: PRIO=254 2026-03-08T22:46:26.972 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:257: TEST_backfill_priority: '[' 254 '!=' 254 ']' 2026-03-08T22:46:26.972 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:264: TEST_backfill_priority: ceph pg cancel-force-backfill 3.0 2026-03-08T22:46:27.191 INFO:tasks.workunit.client.0.vm06.stderr:instructing pg(s) [3.0] on osd.1 to cancel-force-backfill; 2026-03-08T22:46:27.191 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:46:27.203 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:265: TEST_backfill_priority: sleep 5 2026-03-08T22:46:32.204 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:266: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:32.205 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:32.205 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:32.205 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:32.205 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:32.205 
INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:32.205 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:32.205 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:266: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:46:32.205 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:266: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:32.265 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 
2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:32.266 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:32.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:269: TEST_backfill_priority: flush_pg_stats 2026-03-08T22:46:32.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:46:32.276 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:46:32.495 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:46:32.496 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:46:32.496 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:46:32.496 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:46:32.496 INFO:tasks.workunit.client.0.vm06.stderr:4' 2026-03-08T22:46:32.496 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:46:32.496 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:32.496 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:46:32.587 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836516 2026-03-08T22:46:32.587 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836516 2026-03-08T22:46:32.587 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516' 2026-03-08T22:46:32.587 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:32.587 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:32.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705699 2026-03-08T22:46:32.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705699 2026-03-08T22:46:32.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-38654705699' 2026-03-08T22:46:32.676 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:32.676 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:32.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574881 2026-03-08T22:46:32.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574881 2026-03-08T22:46:32.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-38654705699 2-55834574881' 2026-03-08T22:46:32.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:32.768 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:32.851 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411360 2026-03-08T22:46:32.851 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411360 2026-03-08T22:46:32.851 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-38654705699 2-55834574881 3-77309411360' 2026-03-08T22:46:32.851 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:32.851 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:46:32.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247839 2026-03-08T22:46:32.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247839 2026-03-08T22:46:32.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-38654705699 2-55834574881 3-77309411360 4-98784247839' 2026-03-08T22:46:32.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:32.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836516 2026-03-08T22:46:32.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:32.939 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:32.939 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836516 2026-03-08T22:46:32.939 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:32.940 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836516 2026-03-08T22:46:32.940 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836516 2026-03-08T22:46:32.940 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836516' 2026-03-08T22:46:32.940 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:33.168 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836516 -lt 21474836516 2026-03-08T22:46:33.168 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:33.168 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705699 2026-03-08T22:46:33.168 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:33.169 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:46:33.169 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705699 2026-03-08T22:46:33.169 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:33.170 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705699 2026-03-08T22:46:33.170 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705699' 2026-03-08T22:46:33.170 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705699 2026-03-08T22:46:33.170 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:33.384 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705697 -lt 38654705699 2026-03-08T22:46:33.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:34.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:34.386 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:34.603 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705697 -lt 38654705699 2026-03-08T22:46:34.604 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
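flush_pg_stats (ceph-helpers.sh:2260-2280 in this clone) is what turns the seq bookkeeping above into a barrier: it asks every OSD to flush its PG stats, records the sequence number each flush returns, then polls the monitor until `ceph osd last-stat-seq` has caught up for each OSD. Condensed to its essentials (error handling and exact control flow trimmed relative to the traced helper):

    # Barrier: do not proceed until the mon has seen stats at least as new
    # as each OSD's just-flushed sequence number.
    flush_pg_stats() {
        local timeout=300 seqs="" osd seq s
        for osd in $(ceph osd ls); do
            seq=$(ceph tell osd.$osd flush_pg_stats)  # returns flushed stat seq
            test -z "$seq" || seqs="$seqs $osd-$seq"
        done
        for s in $seqs; do
            osd=${s%-*} seq=${s#*-}
            echo "waiting osd.$osd seq $seq"
            while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
                sleep 1
                test $((timeout--)) -gt 0 || return 1  # give up after ~300 s
            done
        done
    }

The retries visible above (osd.1 reporting 38654705697 against a target of 38654705699, then sleeping) are exactly this inner loop, and the counter starting at 300 in the '[' 300 -eq 0 ']' checks is its timeout budget.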
2026-03-08T22:46:35.605 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:46:35.605 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:35.840 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705699 -lt 38654705699 2026-03-08T22:46:35.841 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:35.841 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574881 2026-03-08T22:46:35.841 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:35.842 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:46:35.842 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574881 2026-03-08T22:46:35.842 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:35.843 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574881 2026-03-08T22:46:35.843 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574881 2026-03-08T22:46:35.843 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574881' 2026-03-08T22:46:35.844 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:46:36.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574882 -lt 55834574881 2026-03-08T22:46:36.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:36.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411360 2026-03-08T22:46:36.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:36.070 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:46:36.070 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411360 2026-03-08T22:46:36.071 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:36.071 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411360 2026-03-08T22:46:36.071 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411360 2026-03-08T22:46:36.071 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411360' 2026-03-08T22:46:36.072 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:46:36.293 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411360 -lt 77309411360 2026-03-08T22:46:36.293 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:36.293 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247839 2026-03-08T22:46:36.293 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:36.294 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:46:36.294 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247839 2026-03-08T22:46:36.294 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:36.295 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247839 2026-03-08T22:46:36.295 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247839 2026-03-08T22:46:36.295 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247839' 2026-03-08T22:46:36.295 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:46:36.511 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247839 -lt 98784247839 2026-03-08T22:46:36.511 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:270: TEST_backfill_priority: ceph pg force-backfill 7.0 2026-03-08T22:46:36.713 INFO:tasks.workunit.client.0.vm06.stderr:instructing pg(s) [7.0] on osd.1 to force-backfill; 2026-03-08T22:46:36.713 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-08T22:46:36.726 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:271: TEST_backfill_priority: sleep 2 2026-03-08T22:46:38.727 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:273: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:38.727 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:38.727 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:38.727 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:38.727 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:38.727 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:38.728 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:38.728 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:273: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:46:38.728 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:273: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:274: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151, 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: 
"in_progress": [ 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:38.795 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:275: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:38.796 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:275: TEST_backfill_priority: jq '(.local_reservations.queues[].items[] | select(.item == "3.0")).prio' 2026-03-08T22:46:38.805 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:275: TEST_backfill_priority: PRIO=151 2026-03-08T22:46:38.805 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:276: TEST_backfill_priority: '[' 151 '!=' 151 ']' 2026-03-08T22:46:38.805 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:282: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:38.805 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:282: TEST_backfill_priority: jq '.local_reservations.in_progress[0].item' 2026-03-08T22:46:38.814 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:282: TEST_backfill_priority: eval 'ITEM="7.0"' 2026-03-08T22:46:38.815 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:282: TEST_backfill_priority: ITEM=7.0 2026-03-08T22:46:38.815 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:283: TEST_backfill_priority: '[' 7.0 '!=' 7.0 ']' 2026-03-08T22:46:38.815 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:288: TEST_backfill_priority: cat td/osd-backfill-prio/out 2026-03-08T22:46:38.815 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:288: TEST_backfill_priority: jq '.local_reservations.in_progress[0].prio' 2026-03-08T22:46:38.824 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:288: 
TEST_backfill_priority: PRIO=254 2026-03-08T22:46:38.824 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:289: TEST_backfill_priority: '[' 254 '!=' 254 ']' 2026-03-08T22:46:38.824 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:296: TEST_backfill_priority: ceph osd unset noout 2026-03-08T22:46:39.143 INFO:tasks.workunit.client.0.vm06.stderr:noout is unset 2026-03-08T22:46:39.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:297: TEST_backfill_priority: ceph osd unset nobackfill 2026-03-08T22:46:39.454 INFO:tasks.workunit.client.0.vm06.stderr:nobackfill is unset 2026-03-08T22:46:39.469 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:299: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:299: TEST_backfill_priority: wait_for_clean 'CEPH_ARGS='\'''\'' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations' 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local 'cmd=CEPH_ARGS='\'''\'' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations' 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:46:39.470 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:46:39.471 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:46:39.471 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:46:39.471 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:46:39.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:46:39.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:46:39.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:46:39.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:46:39.536 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:46:39.536 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr:4' 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:39.753 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:46:39.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836519 2026-03-08T22:46:39.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836519 2026-03-08T22:46:39.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519' 2026-03-08T22:46:39.833 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:39.833 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:39.910 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: seq=38654705702 2026-03-08T22:46:39.910 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705702 2026-03-08T22:46:39.910 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-38654705702' 2026-03-08T22:46:39.910 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:39.910 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:39.991 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574885 2026-03-08T22:46:39.991 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574885 2026-03-08T22:46:39.991 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-38654705702 2-55834574885' 2026-03-08T22:46:39.991 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:39.991 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:40.087 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411363 2026-03-08T22:46:40.087 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411363 2026-03-08T22:46:40.088 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-38654705702 2-55834574885 3-77309411363' 2026-03-08T22:46:40.088 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:40.088 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:46:40.167 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247842 2026-03-08T22:46:40.167 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247842 2026-03-08T22:46:40.167 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-38654705702 2-55834574885 3-77309411363 4-98784247842' 2026-03-08T22:46:40.167 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:40.167 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836519 2026-03-08T22:46:40.167 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:40.168 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:40.168 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836519 2026-03-08T22:46:40.168 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:40.169 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836519 2026-03-08T22:46:40.169 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836519 2026-03-08T22:46:40.169 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836519' 2026-03-08T22:46:40.169 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:40.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836517 -lt 21474836519 2026-03-08T22:46:40.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:41.412 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:41.412 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:41.697 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836519 -lt 21474836519 2026-03-08T22:46:41.697 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:41.698 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705702 2026-03-08T22:46:41.698 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:41.699 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:46:41.699 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705702 2026-03-08T22:46:41.699 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:41.700 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 38654705702 2026-03-08T22:46:41.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705702 2026-03-08T22:46:41.700 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705702' 2026-03-08T22:46:41.700 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:41.958 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705702 -lt 38654705702 2026-03-08T22:46:41.958 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:41.959 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-55834574885 2026-03-08T22:46:41.959 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:41.960 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:46:41.960 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-55834574885 2026-03-08T22:46:41.960 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:41.961 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 55834574885 2026-03-08T22:46:41.961 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574885 2026-03-08T22:46:41.961 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 55834574885' 2026-03-08T22:46:41.961 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:46:42.220 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574885 -lt 55834574885 2026-03-08T22:46:42.220 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:42.220 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-77309411363 2026-03-08T22:46:42.220 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:42.221 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:46:42.222 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-77309411363 2026-03-08T22:46:42.222 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:42.223 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 77309411363 2026-03-08T22:46:42.223 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411363 2026-03-08T22:46:42.223 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 77309411363' 2026-03-08T22:46:42.223 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:46:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411364 -lt 77309411363 2026-03-08T22:46:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:42.444 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-98784247842 2026-03-08T22:46:42.444 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:42.447 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:46:42.447 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:42.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-98784247842 2026-03-08T22:46:42.455 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 98784247842 2026-03-08T22:46:42.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247842 2026-03-08T22:46:42.455 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 98784247842' 2026-03-08T22:46:42.455 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:46:42.706 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247843 -lt 98784247842 2026-03-08T22:46:42.706 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:46:42.706 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:42.706 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 3 == 0 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: 
get_num_active_clean 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:43.045 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:43.291 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:46:43.291 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:43.291 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:43.292 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:43.603 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3 2026-03-08T22:46:43.603 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' -1 2026-03-08T22:46:43.603 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:46:43.603 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=0 2026-03-08T22:46:43.603 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:43.603 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:43.603 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:43.658 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:43.658 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 
2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:43.659 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:43.667 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:43.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:43.779 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:43.780 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:43.780 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local 
expression 2026-03-08T22:46:43.780 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:43.780 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:43.780 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:43.780 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:43.989 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:46:43.989 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:43.989 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:43.989 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:44.373 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:44.700 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=28171545 2026-03-08T22:46:44.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 28171545 '!=' null 2026-03-08T22:46:44.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:44.700 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:44.700 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:44.700 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0", 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": 
true 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:44.759 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:44.767 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:44.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:44.869 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:44.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:44.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:44.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:44.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:44.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:44.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:45.090 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:46:45.090 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:45.090 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:45.091 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3 2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 
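For reference: each poll iteration above runs get_num_active_clean (count of PGs whose state contains "active" and "clean" but not "stale"), compares it against get_num_pgs, and resets the loop counter whenever get_is_making_recovery_progress sees non-null recovery rates in the pgmap. A minimal standalone sketch of that polling pattern, using the same ceph and jq invocations traced in this run and the delay ladder that get_timeout_delays printed earlier; the script itself is illustrative and not part of the suite:

    #!/usr/bin/env bash
    # Sketch (hypothetical script): poll until every PG is active+clean,
    # backing off with the delay ladder seen in the wait_for_clean trace.
    delays=(0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5)
    for delay in "${delays[@]}"; do
        # Total PG count, as in get_num_pgs (ceph-helpers.sh:1425).
        num_pgs=$(ceph --format json status | jq .pgmap.num_pgs)
        # PGs active+clean and not stale, as in get_num_active_clean.
        clean=$(ceph --format json pg dump pgs |
            jq '.pg_stats | [.[] | .state |
                  select(contains("active") and contains("clean")) |
                  select(contains("stale") | not)] | length')
        if test "$clean" = "$num_pgs"; then
            echo "all $num_pgs PGs active+clean"
            exit 0
        fi
        sleep "$delay"
    done
    echo "timed out waiting for clean" >&2
    exit 1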
2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:46:45.389 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:46:45.703 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=28171545
2026-03-08T22:46:45.703 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 28171545 '!=' null
2026-03-08T22:46:45.703 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0
2026-03-08T22:46:45.703 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:45.703 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS=
2026-03-08T22:46:45.703 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout:{
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": {
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "items": [
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0",
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "items": [
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0",
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: ],
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0",
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": {
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [],
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": []
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:45.760 INFO:tasks.workunit.client.0.vm06.stdout:}
2026-03-08T22:46:45.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1
2026-03-08T22:46:45.869 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:46:45.870 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:46:45.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:46:45.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:46:45.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:46:45.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:46:45.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:46:45.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:46:46.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:46:46.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:46:46.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:46:46.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:46:46.385 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:46:46.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=21033139
2026-03-08T22:46:46.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 21033139 '!=' null
2026-03-08T22:46:46.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0
2026-03-08T22:46:46.698 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:46.698 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS=
2026-03-08T22:46:46.698 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout:{
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": {
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "items": [
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0",
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "items": [
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0",
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: ],
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0",
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": {
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [],
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": []
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:46.760 INFO:tasks.workunit.client.0.vm06.stdout:}
2026-03-08T22:46:46.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1
2026-03-08T22:46:46.869 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:46:46.869 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:46:46.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:46:46.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:46:46.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:46:46.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:46:46.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:46:46.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:46:47.098 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:46:47.098 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:46:47.098 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:46:47.099 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:46:47.424 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:46:47.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=21033139
2026-03-08T22:46:47.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 21033139 '!=' null
2026-03-08T22:46:47.735 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0
2026-03-08T22:46:47.736 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:47.736 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS=
2026-03-08T22:46:47.736 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout:{
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": {
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "items": [
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0",
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 151,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "items": [
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0",
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: ],
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "item": "7.0",
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 254,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": {
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [],
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": []
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:47.796 INFO:tasks.workunit.client.0.vm06.stdout:}
2026-03-08T22:46:47.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:46:47.909 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:46:48.151 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:46:48.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:46:48.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:46:48.152 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:46:48.451 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:46:48.770 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=61594157
2026-03-08T22:46:48.771 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 61594157 '!=' null
2026-03-08T22:46:48.771 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0
2026-03-08T22:46:48.771 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:48.771 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS=
2026-03-08T22:46:48.771 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout:{
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": {
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110,
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "items": [
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0",
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110,
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: ],
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0",
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151,
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": {
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1,
2026-03-08T22:46:48.836 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0,
2026-03-08T22:46:48.837 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [],
2026-03-08T22:46:48.837 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": []
2026-03-08T22:46:48.837 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:46:48.837 INFO:tasks.workunit.client.0.vm06.stdout:}
2026-03-08T22:46:48.843 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1
2026-03-08T22:46:48.944 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:46:48.944 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:46:48.944 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:46:48.945
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:48.945 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:48.945 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:48.945 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:48.945 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:49.203 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:46:49.203 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:49.204 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:49.204 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:49.534 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + 
.recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:49.826 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=61594157 2026-03-08T22:46:49.826 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 61594157 '!=' null 2026-03-08T22:46:49.826 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:49.826 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:49.826 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:49.826 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:49.884 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:49.884 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:49.884 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:49.884 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:49.884 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:49.884 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:49.885 
INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:49.885 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:49.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:49.994 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:49.994 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:49.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:49.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:49.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:49.994 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:49.995 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:49.995 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:50.248 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:46:50.248 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:50.248 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:50.249 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:50.550 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:50.550 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:50.861 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=41936332 2026-03-08T22:46:50.861 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 41936332 '!=' null 2026-03-08T22:46:50.861 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:50.861 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:50.861 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:50.861 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:50.926 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:50.926 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "item": 
"3.0", 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:50.927 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:50.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:51.038 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:51.038 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:51.038 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:51.038 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:51.038 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:51.038 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:51.038 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:51.039 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:51.276 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:46:51.277 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:51.277 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:51.277 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3 
2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:51.583 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:51.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=41936332 2026-03-08T22:46:51.883 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 41936332 '!=' null 2026-03-08T22:46:51.883 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:51.883 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:51.883 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:51.883 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:51.973 
INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:51.973 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:51.974 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:51.991 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:52.095 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:52.095 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:52.095 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:52.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:52.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:52.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:52.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:52.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:52.344 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:46:52.344 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:52.344 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:52.344 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:52.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 3 2026-03-08T22:46:52.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:46:52.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:52.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:52.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:52.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:52.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:52.680 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:52.680 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:52.992 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=85184393 2026-03-08T22:46:52.992 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 85184393 '!=' null 2026-03-08T22:46:52.993 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:52.993 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:52.993 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:52.993 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.058 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:53.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:53.169 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:53.169 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:53.169 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:53.169 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:53.169 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:53.169 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:53.170 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:53.170 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:53.387 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:46:53.388 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:53.388 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:53.388 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:53.682 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 3 2026-03-08T22:46:53.682 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 0 2026-03-08T22:46:53.682 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:46:53.682 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=1 2026-03-08T22:46:53.682 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:53.682 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:53.682 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 
2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:53.750 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:53.751 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:53.751 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:53.751 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:53.751 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:53.751 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:53.751 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:53.759 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:53.860 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:53.860 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:53.860 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:53.860 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:53.861 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:53.861 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:53.861 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:53.861 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:54.087 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:46:54.087 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:54.087 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:54.087 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 3 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:54.397 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:54.708 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=85184085 2026-03-08T22:46:54.709 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 85184085 '!=' null 2026-03-08T22:46:54.709 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:54.709 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:54.709 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:54.709 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [ 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "priority": 110, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "items": [ 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: ], 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "item": "3.0", 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 151, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:54.771 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:54.780 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:54.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:54.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:54.882 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:54.882 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:54.882 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:54.882 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:54.882 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:54.882 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:55.121 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:46:55.121 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:55.121 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:55.121 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:55.440 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 3 2026-03-08T22:46:55.440 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:46:55.440 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:55.440 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:55.440 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:55.440 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:55.440 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:55.441 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:55.441 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:55.748 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=85184085 2026-03-08T22:46:55.748 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 85184085 '!=' null 2026-03-08T22:46:55.748 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:55.748 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:55.748 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:55.748 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:55.814 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:55.823 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:55.925 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:55.925 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:55.925 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:55.925 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:55.925 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:55.925 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:55.926 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:55.926 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:56.151 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:46:56.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:56.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:56.151 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:56.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 3 2026-03-08T22:46:56.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:46:56.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:56.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:56.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:56.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:56.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:56.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:56.456 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: 
get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:56.782 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=43247032 2026-03-08T22:46:56.782 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 43247032 '!=' null 2026-03-08T22:46:56.782 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:56.782 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:56.782 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:56.782 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:56.847 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:56.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:56.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:56.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:56.957 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:56.957 
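
Because not all PGs are clean yet, every pass falls through to get_is_making_recovery_progress (xtrace lines 1334-1340). It sums the three pgmap recovery rates and treats any non-null sum as progress; the values seen here (85184085, 43247032, ...) are dominated by recovering_bytes_per_sec. Reconstructed sketch:

    # Sketch reconstructed from the xtrace (ceph-helpers.sh:1334-1340).
    # Succeeds while the pgmap still reports a (non-null) recovery rate;
    # jq's "+" on all-null operands yields null, so "null" means recovery idle.
    function get_is_making_recovery_progress() {
        local recovery_progress
        recovery_progress+='.recovering_keys_per_sec + '
        recovery_progress+='.recovering_bytes_per_sec + '
        recovery_progress+='.recovering_objects_per_sec'
        local progress
        progress=$(ceph --format json status |
            jq -r ".pgmap | $recovery_progress")
        test "$progress" != null
    }
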
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:56.957 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:56.957 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:56.957 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:56.957 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:57.200 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:46:57.200 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:57.200 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:57.200 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 3 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:57.503 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + 
.recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:57.800 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=43247032 2026-03-08T22:46:57.801 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 43247032 '!=' null 2026-03-08T22:46:57.801 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:57.801 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:57.801 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:57.801 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:57.865 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:57.874 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:57.975 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:57.975 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:57.975 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:57.975 
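
Stitching the traced line numbers together (1667, 1671-1677, 1683-1685) gives the shape of the wait_for_clean loop itself. This is a sketch: the exit and timeout paths are never exercised in this run, so their exact form is guessed and marked as such, and $asok stands for the admin-socket path seen in the trace (/tmp/ceph-asok.19257/ceph-osd.1.asok).

    # Skeleton of wait_for_clean as reconstructed from the xtrace line numbers.
    # The loop resets its stall counter whenever the active+clean count moves
    # or recovery still reports a rate, dumps osd.1's recovery reservations
    # for debugging, then polls again.
    function wait_for_clean() {
        local -i loop=0
        local num_active_clean=-1
        local cur_active_clean
        local asok=/tmp/ceph-asok.19257/ceph-osd.1.asok   # path seen in this run
        while true ; do                                                 # :1667
            cur_active_clean=$(get_num_active_clean)                    # :1671
            test "$cur_active_clean" = "$(get_num_pgs)" && break        # :1672 (exit path assumed)
            if test "$cur_active_clean" != "$num_active_clean" ; then   # :1673
                loop=0                                                  # :1674
                num_active_clean=$cur_active_clean                      # :1675
            elif get_is_making_recovery_progress ; then                 # :1676
                loop=0                                                  # :1677
            fi
            # a timeout check on $loop presumably lives here; never hit in this run
            CEPH_ARGS='' ceph --admin-daemon "$asok" dump_recovery_reservations  # :1683 (trace wraps this in eval)
            sleep 0.1                                                   # :1684 (delay as traced)
            loop+=1                                                     # :1685
        done
    }
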
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:57.975 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:57.975 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:57.975 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:57.976 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:58.214 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:46:58.214 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:58.214 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:58.214 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 3 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:46:58.539 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + 
.recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:46:58.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=87804881 2026-03-08T22:46:58.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 87804881 '!=' null 2026-03-08T22:46:58.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:46:58.856 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:58.856 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:58.856 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:58.920 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:58.930 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:59.031 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:59.031 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:59.031 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:59.031 
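
The dump_recovery_reservations output repeated above is the loop's debugging aid: PG 1.0 holds the single local reservation slot on osd.1 (max_allowed 1) at backfill priority 110 and can be preempted, while no remote reservations are in flight. A usage sketch (not part of ceph-helpers.sh) for pulling just the in-flight count out of that JSON:

    # Usage sketch: count in-flight local recovery reservations on an OSD
    # through its admin socket. The socket path is the one used in this run.
    asok=/tmp/ceph-asok.19257/ceph-osd.1.asok
    ceph --admin-daemon "$asok" dump_recovery_reservations |
        jq '.local_reservations.in_progress | length'
    # prints 1 while PG 1.0 is backfilling, 0 once the slot drains
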
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:59.031 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:59.031 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:59.031 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:59.031 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:59.264 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:46:59.264 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:59.264 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:59.264 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:59.556 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:46:59.557 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 1 2026-03-08T22:46:59.557 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:46:59.557 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=2 2026-03-08T22:46:59.557 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:59.557 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:46:59.557 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:59.613 
INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:46:59.613 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:46:59.621 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:46:59.722 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:46:59.722 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:59.722 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:59.722 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:59.722 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:59.722 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:59.723 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:59.723 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:59.955 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:46:59.956 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:59.956 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:59.956 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:00.286 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:00.587 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=87804881 2026-03-08T22:47:00.588 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 87804881 '!=' null 2026-03-08T22:47:00.588 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:00.588 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:00.588 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:00.588 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:00.662 
INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:00.662 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:00.670 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:00.772 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:00.772 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:01.007 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:01.007 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:01.008 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:01.014 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: 
ceph --format json status 2026-03-08T22:47:01.323 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:01.324 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:01.629 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=44557184 2026-03-08T22:47:01.629 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 44557184 '!=' null 2026-03-08T22:47:01.629 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:01.629 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:01.629 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:01.629 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:01.689 
INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [ 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "item": "1.0", 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "prio": 110, 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "can_preempt": true 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:01.689 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:01.707 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:01.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:01.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:01.808 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:01.809 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:01.809 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:01.809 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:01.809 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:01.809 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:02.047 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:02.048 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:02.048 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:02.048 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: 
ceph --format json status 2026-03-08T22:47:02.364 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:02.364 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:02.364 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:02.364 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:02.364 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:02.364 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:02.364 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:02.365 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:02.365 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:02.696 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=100474872 2026-03-08T22:47:02.696 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 100474872 '!=' null 2026-03-08T22:47:02.696 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:02.696 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:02.696 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:02.696 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:02.758 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:02.758 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:02.758 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:02.758 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:02.758 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:02.758 
INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:02.758 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:02.759 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:02.759 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:02.759 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:02.759 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:02.759 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:02.759 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:02.759 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:02.768 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:02.869 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:02.870 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:02.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:02.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:02.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:02.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:02.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:02.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:03.107 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:03.107 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:03.107 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:03.107 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:03.411 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:03.411 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:03.715 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=100474872 2026-03-08T22:47:03.715 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 100474872 '!=' null 2026-03-08T22:47:03.715 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:03.715 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:03.715 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:03.715 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:03.779 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:03.779 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:03.779 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:03.779 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:03.779 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:03.779 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:03.779 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:03.780 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:03.780 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:03.780 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:03.780 
INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:03.780 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:03.780 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:03.780 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:03.789 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:03.890 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:03.890 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:03.890 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:03.890 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:03.890 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:03.890 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:03.891 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:03.891 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:04.124 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:04.124 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:04.125 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:04.125 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:04.457 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:04.457 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:04.759 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=91735532 2026-03-08T22:47:04.760 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 91735532 '!=' null 2026-03-08T22:47:04.760 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:04.760 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:04.760 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:04.760 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:04.820 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:04.829 
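
From 22:47:02.758 onward the dumps show both in_progress arrays empty: osd.1 has released its recovery reservation even though only 2 of 3 PGs are active+clean, so the loop keeps spinning on the still-non-null recovery rate (91735532). A small follow-on sketch (again not from ceph-helpers.sh) that turns that observation into a test:

    # Usage sketch: exit 0 when no local or remote recovery reservations
    # remain in flight on the OSD ($asok as above).
    ceph --admin-daemon "$asok" dump_recovery_reservations |
        jq -e '(.local_reservations.in_progress
                + .remote_reservations.in_progress) | length == 0' > /dev/null
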
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:04.930 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:04.930 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:04.930 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:04.930 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:04.930 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:04.930 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:04.931 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:04.931 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:05.162 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:05.163 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:05.163 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:05.163 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: 
recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:05.461 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:05.766 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=91735532 2026-03-08T22:47:05.766 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 91735532 '!=' null 2026-03-08T22:47:05.766 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:05.766 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:05.766 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:05.766 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:05.828 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:05.837 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: 
wait_for_clean: true 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:05.938 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:06.170 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:06.170 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:06.171 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:06.171 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 
2026-03-08T22:47:06.478 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:06.779 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=47178150 2026-03-08T22:47:06.779 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 47178150 '!=' null 2026-03-08T22:47:06.779 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:06.779 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:06.779 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:06.779 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:06.838 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:06.847 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:06.948 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:06.948 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:06.948 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:06.948 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:06.948 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:06.948 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:06.949 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:06.949 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:07.178 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:07.178 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:07.179 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:07.179 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:07.466 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:07.769 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=47178150 2026-03-08T22:47:07.770 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 47178150 '!=' null 2026-03-08T22:47:07.770 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:07.770 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:07.770 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:07.770 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:07.825 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:07.834 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:07.936 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:08.165 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:47:08.166 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:08.166 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:08.166 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:08.463 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 3 2026-03-08T22:47:08.463 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' 2 2026-03-08T22:47:08.463 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:47:08.463 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:47:08.463 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:47:08.463 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:47:08.464 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:47:08.464 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:47:08.464 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:47:08.759 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=47178466 2026-03-08T22:47:08.759 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 47178466 '!=' null 2026-03-08T22:47:08.759 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:47:08.759 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 'CEPH_ARGS='\'''\''' ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:08.759 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: CEPH_ARGS= 2026-03-08T22:47:08.759 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_recovery_reservations 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "local_reservations": { 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "remote_reservations": { 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "max_allowed": 1, 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "min_priority": 0, 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "queues": [], 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: "in_progress": [] 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:08.825 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:47:08.926 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:47:08.927 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:08.927 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:08.927 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:47:08.927 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:08.927 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:08.927 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:08.927 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:09.143 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:47:09.143 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:09.143 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:09.143 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:09.425 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 3 2026-03-08T22:47:09.425 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:47:09.425 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:47:09.425 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:301: TEST_backfill_priority: ceph pg dump pgs 2026-03-08T22:47:09.641 INFO:tasks.workunit.client.0.vm06.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:47:09.641 INFO:tasks.workunit.client.0.vm06.stdout:7.0 50 0 0 0 0 524288000 0 0 50 100 50 active+clean 2026-03-08T22:46:47.999919+0000 86'150 111:379 [1,4] 1 [1,4] 1 0'0 2026-03-08T22:45:22.591405+0000 0'0 2026-03-08T22:45:22.591405+0000 0 0 periodic scrub scheduled @ 2026-03-10T06:35:38.802906+0000 0 0 2026-03-08T22:47:09.641 INFO:tasks.workunit.client.0.vm06.stdout:3.0 50 0 0 0 0 524288000 0 0 50 100 50 active+clean 2026-03-08T22:46:55.325639+0000 86'150 111:407 [1,2] 1 [1,2] 1 0'0 2026-03-08T22:45:15.214887+0000 0'0 2026-03-08T22:45:15.214887+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:06:52.674849+0000 0 0 2026-03-08T22:47:09.641 INFO:tasks.workunit.client.0.vm06.stdout:1.0 50 0 0 0 0 524288000 0 0 50 100 50 active+clean 2026-03-08T22:47:02.549287+0000 86'150 111:421 [1,2] 1 [1,2] 1 0'0 2026-03-08T22:45:11.963744+0000 0'0 2026-03-08T22:45:11.963744+0000 0 0 periodic scrub scheduled @ 2026-03-10T04:30:09.702504+0000 0 0 2026-03-08T22:47:09.641 INFO:tasks.workunit.client.0.vm06.stdout: 2026-03-08T22:47:09.641 INFO:tasks.workunit.client.0.vm06.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
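With all three PGs now active+clean (UP/ACTING sets [1,4] and [1,2] in the dump above), the test turns to osd.1's admin socket and pulls dump_pgstate_history to check the order of the backfill reservations. A hedged jq sketch for isolating the backfill transitions of one PG from the JSON that follows — field names (.pgs, .history, .states, .state/.enter/.exit) are taken from the dump itself, and the socket path is the one get_asok_path prints below:

    ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_pgstate_history |
        jq '.pgs[] | select(.pg == "7.0")
            | .history[].states[]
            | select(.state | test("Backfill"))
            | {state, enter, exit}'

For PG 7.0 this would surface the WaitLocalBackfillReserved → WaitRemoteBackfillReserved → Backfilling sequence twice, with a NotBackfilling interval in between (the backfill was preempted and then re-reserved), consistent with what TEST_backfill_priority is checking.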
2026-03-08T22:47:09.641 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs 2026-03-08T22:47:09.655 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:303: TEST_backfill_priority: get_asok_path osd.1 2026-03-08T22:47:09.655 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:47:09.655 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:47:09.655 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:47:09.655 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:09.655 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:47:09.656 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19257/ceph-osd.1.asok 2026-03-08T22:47:09.656 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:303: TEST_backfill_priority: CEPH_ARGS= 2026-03-08T22:47:09.656 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:303: TEST_backfill_priority: ceph --admin-daemon /tmp/ceph-asok.19257/ceph-osd.1.asok dump_pgstate_history 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "pgs": [ 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "pg": "7.0", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "currently": "Started/Primary/Active/Clean", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "history": [ 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "59", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.601160+0000", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:22.601164+0000" 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.601168+0000", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:22.603587+0000" 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "state": 
"Started/Primary/Peering/GetLog", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.603587+0000", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:22.603616+0000" 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.603616+0000", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:22.603622+0000" 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.603622+0000", 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:22.700882+0000" 2026-03-08T22:47:09.717 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.601166+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:22.700886+0000" 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.700962+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.304148+0000" 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.304149+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.304183+0000" 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.304183+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.882548+0000" 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.700887+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.882565+0000" 2026-03-08T22:47:09.718 
INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.601165+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.882603+0000" 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:22.601158+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.882606+0000" 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "59", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.882606+0000", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.882801+0000" 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "85", 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.718 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.882804+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.882810+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.882813+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.889383+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.889383+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.889418+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 
2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.889418+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:23.889424+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.889424+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:24.011047+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.882811+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:24.011051+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:24.011112+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:24.131996+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:24.131996+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:24.132015+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:24.132016+0000", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.156935+0000" 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active", 2026-03-08T22:47:09.719 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:24.011052+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.156949+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.882810+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": 
"2026-03-08T22:45:41.156979+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:23.882801+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.156981+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "85", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.156982+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.157098+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "91", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.157100+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.157107+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.157112+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.160112+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.160113+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.160151+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.160151+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.160158+0000" 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: }, 
2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.160158+0000", 2026-03-08T22:47:09.720 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.253040+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.157110+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.253057+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.253097+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.254340+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.254340+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.254366+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.254366+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.606522+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.253058+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.606539+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.157107+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.606592+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.157098+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: 
"exit": "2026-03-08T22:46:00.606594+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "92", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.606594+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.606795+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "93", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.606797+0000", 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.606804+0000" 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.721 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.606809+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.608841+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.608842+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.608880+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.606806+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.608885+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/WaitActingChange", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.608885+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.749731+0000" 2026-03-08T22:47:09.722 
INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.606804+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.749816+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.606795+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.749819+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "93", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.749819+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.749924+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "106", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.749927+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.749933+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.749938+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.750343+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.750344+0000", 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.750386+0000" 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.722 INFO:tasks.workunit.client.0.vm06.stdout: { 
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.750386+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.750393+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.750394+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.851900+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.749935+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.851976+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.852126+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.855329+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitLocalBackfillReserved",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.855330+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.855433+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitRemoteBackfillReserved",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.855433+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:00.855819+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Backfilling",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.855820+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:24.808860+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/NotBackfilling",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:24.808860+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:24.809492+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitLocalBackfillReserved",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:24.809492+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:36.663997+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitRemoteBackfillReserved",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:36.663998+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:36.664699+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Backfilling",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:36.664699+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.798354+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:47.798355+0000",
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.798451+0000"
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.723 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:47.798452+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.899630+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.851977+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.899647+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.749933+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.899676+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:00.749925+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.899678+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "106",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:47.899678+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.899864+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "pg": "3.0",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "currently": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "history": [
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "39",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.330834+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:15.330840+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.330842+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:15.337874+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.337874+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:15.337924+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.337924+0000",
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:15.337930+0000"
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.724 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.337931+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:15.437101+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.330841+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:15.437105+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.437216+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.318885+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.318885+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.318913+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.318913+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.936569+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.437106+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.936578+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.330840+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.936603+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:15.330833+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.936605+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "39",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.936605+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.936692+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "83",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.936693+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.936698+0000"
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.936701+0000",
2026-03-08T22:47:09.725 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.939562+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.939563+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.939619+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.939620+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:16.939630+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.939630+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:17.038682+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.936700+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:17.038687+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:17.038772+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:17.040354+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:17.040354+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:17.040388+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:17.040388+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.155729+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:17.038688+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.155741+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.936698+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.155765+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:16.936692+0000",
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.155767+0000"
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.726 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "85",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.155767+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.155930+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "98",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.155933+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.155942+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.155949+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.156027+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.156027+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.156069+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.156070+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.156079+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.156079+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.253046+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.155946+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.253051+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.253095+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.254198+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.254199+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:41.254216+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.254216+0000",
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.585479+0000"
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.727 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.253052+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.585573+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.155942+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.585647+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:41.155930+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.585649+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "98",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.585649+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.585855+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "100",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.585857+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.585863+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.585867+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.586269+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.586269+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.586304+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.585866+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.586310+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/WaitActingChange",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.586310+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649186+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.585864+0000",
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649208+0000"
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.728 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.585855+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649211+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "100",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649212+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649387+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "108",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649390+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649399+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649406+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649824+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649824+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649874+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649875+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.649885+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649886+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.753767+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649403+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:14.753776+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.754014+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:15.611043+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitLocalBackfillReserved",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:15.611043+0000",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:24.808871+0000"
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitRemoteBackfillReserved",
2026-03-08T22:47:09.729 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:24.808872+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:24.809432+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Backfilling",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:24.809432+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:36.664030+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/NotBackfilling",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:36.664030+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:36.664109+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitLocalBackfillReserved",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:36.664110+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.798631+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitRemoteBackfillReserved",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:47.798635+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:47.799253+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Backfilling",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:47.799254+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.166419+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:55.166420+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.166475+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:55.166475+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.222050+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.753776+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.222064+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649400+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.222207+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:14.649387+0000",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.222208+0000"
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "108",
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.730 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:55.222210+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.222378+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "pg": "1.0",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "currently": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "history": [
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "25",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Initial",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.010901+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.010992+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "25",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.010993+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.011007+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "28",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.011009+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.011013+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.011016+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.023211+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.023218+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.023260+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.023260+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.023266+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.023267+0000",
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.127876+0000"
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.731 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.011015+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.127884+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.128985+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.823135+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.823136+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:12.823161+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.823161+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.426071+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.127885+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.426088+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.011013+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.426137+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:12.011007+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.426139+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "29",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.426139+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.426293+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "95",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.426307+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.426313+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.426317+0000",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.428573+0000"
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.732 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.428574+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.428736+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.428736+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.428743+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.428744+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.515180+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.426315+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.515185+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.515262+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.517632+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.517633+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:45:13.517667+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.517667+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.133267+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.515186+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.133285+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.426313+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.133356+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:45:13.426294+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.133357+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "95",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.133358+0000",
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.133531+0000"
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.733 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "96",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.133533+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.133539+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.133542+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.135834+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.135834+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.135879+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.133541+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.135883+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/WaitActingChange",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.135883+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.190165+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.133539+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.190187+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.133532+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.190190+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "96",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.190192+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.190311+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: }
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: ]
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "110",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "states": [
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Start",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.190313+0000",
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.190320+0000"
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: },
2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: {
2026-03-08T22:47:09.734
INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetInfo", 2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.190324+0000", 2026-03-08T22:47:09.734 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.192413+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetLog", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.192414+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.192461+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/GetMissing", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.192461+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.192471+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering/WaitUpThru", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.192471+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.343450+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Peering", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.190322+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.343456+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Activating", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.343607+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:08.459671+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitLocalBackfillReserved", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.459672+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:46:55.166602+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/WaitRemoteBackfillReserved", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:55.166602+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: 
"exit": "2026-03-08T22:46:55.167125+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Backfilling", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:55.167125+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:47:02.389072+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Recovered", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:47:02.389073+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:47:02.389154+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active/Clean", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:47:02.389155+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:47:02.417749+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary/Active", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.343456+0000", 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:47:02.417762+0000" 2026-03-08T22:47:09.735 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started/Primary", 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.190320+0000", 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:47:02.417791+0000" 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Started", 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:46:08.190311+0000", 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "exit": "2026-03-08T22:47:02.417793+0000" 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "epoch": "110", 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "states": [ 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "state": "Reset", 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "enter": "2026-03-08T22:47:02.417794+0000", 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: "exit": 
"2026-03-08T22:47:02.417976+0000" 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout: ] 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout:} 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stdout:TEST PASSED 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:305: TEST_backfill_priority: '[' 0 '!=' 0 ']' 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:309: TEST_backfill_priority: echo TEST PASSED 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:312: TEST_backfill_priority: delete_pool test1 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test1 2026-03-08T22:47:09.736 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test1 test1 --yes-i-really-really-mean-it 2026-03-08T22:47:10.005 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test1' does not exist 2026-03-08T22:47:10.018 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:313: TEST_backfill_priority: delete_pool test3 2026-03-08T22:47:10.019 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test3 2026-03-08T22:47:10.019 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test3 test3 --yes-i-really-really-mean-it 2026-03-08T22:47:10.308 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test3' does not exist 2026-03-08T22:47:10.325 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:314: TEST_backfill_priority: delete_pool test7 2026-03-08T22:47:10.325 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=test7 2026-03-08T22:47:10.325 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete test7 test7 --yes-i-really-really-mean-it 2026-03-08T22:47:10.612 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test7' does not exist 2026-03-08T22:47:10.626 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:315: TEST_backfill_priority: kill_daemons td/osd-backfill-prio 2026-03-08T22:47:10.626 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:10.626 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T22:47:10.627 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:10.627 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:10.627 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:316: TEST_backfill_priority: return 0 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-prio.sh:43: run: teardown td/osd-backfill-prio 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-backfill-prio 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-backfill-prio KILL 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:15.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:15.966 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:15.966 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:47:15.967 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:47:15.967 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
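The shopt -q -o xtrace / echo true pair that keeps reappearing at ceph-helpers.sh:345 is kill_daemons saving bash's trace flag and switching tracing off around its daemon-killing wait loop, which would otherwise flood the log. A reconstruction from the traced lines (:345-:346 and the return at :362); the || echo false fallback and the final restore are assumptions, since only the tracing-enabled path is visible in this run:

    kill_daemons() {
        # remember whether xtrace is on, then silence it for the noisy part
        local trace=$(shopt -q -o xtrace && echo true || echo false)
        $trace && shopt -u -o xtrace
        # ... signal and wait for the daemons recorded under $dir (elided) ...
        $trace && shopt -s -o xtrace   # assumed restore of the saved flag
        return 0
    }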
2026-03-08T22:47:15.967 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:47:15.967 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:47:15.968 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:47:15.968 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:15.968 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:47:15.969 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:47:15.969 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:15.969 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:47:15.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:47:15.970 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-backfill-prio 2026-03-08T22:47:15.997 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:47:15.997 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:15.997 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:47:15.997 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19257 2026-03-08T22:47:15.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:47:15.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:47:15.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T22:47:15.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-backfill-prio 0 2026-03-08T22:47:15.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-backfill-prio 2026-03-08T22:47:15.998 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T22:47:15.998 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-backfill-prio KILL 2026-03-08T22:47:15.999 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:15.999 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:15.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:15.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:15.999 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:16.000 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:16.000 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:47:16.001 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:47:16.001 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:47:16.002 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:47:16.002 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:47:16.002 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:47:16.003 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:16.003 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:47:16.003 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:47:16.003 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:16.004 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:47:16.005 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T22:47:16.005 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-backfill-prio 2026-03-08T22:47:16.005 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:47:16.005 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.005 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19257 2026-03-08T22:47:16.006 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19257 2026-03-08T22:47:16.007 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:47:16.007 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:47:16.007 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T22:47:16.007 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T22:47:16.007 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T22:47:16.014 INFO:tasks.workunit:Running workunit osd-backfill/osd-backfill-recovery-log.sh... 2026-03-08T22:47:16.014 DEBUG:teuthology.orchestra.run.vm06:workunit test osd-backfill/osd-backfill-recovery-log.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh 2026-03-08T22:47:16.063 INFO:tasks.workunit.client.0.vm06.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: 
CEPH_CONF=/dev/null 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-backfill-recovery-log 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:21: run: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:22: run: shift 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:25: run: export CEPH_MON=127.0.0.1:7129 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:25: run: CEPH_MON=127.0.0.1:7129 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:26: run: export CEPH_ARGS 2026-03-08T22:47:16.066 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:27: run: uuidgen 2026-03-08T22:47:16.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:27: run: CEPH_ARGS+='--fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none ' 2026-03-08T22:47:16.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:28: run: CEPH_ARGS+='--mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 ' 2026-03-08T22:47:16.067 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:29: run: CEPH_ARGS+='--osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:16.067 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:31: run: set 2026-03-08T22:47:16.067 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:31: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:47:16.068 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:31: run: local 'funcs=TEST_backfill_log_1 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:TEST_backfill_log_2 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:TEST_recovery_1 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:TEST_recovery_2' 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:32: run: for func in $funcs 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:33: run: setup 
td/osd-backfill-recovery-log 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-backfill-recovery-log 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-backfill-recovery-log KILL 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:16.069 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:16.070 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:16.070 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:47:16.071 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:47:16.071 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
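run() above discovers its test functions by sed-scanning the output of set for names matching TEST_*, and setup() begins by tearing down the same directory, so every test starts from a clean slate even if the previous one died mid-run. A condensed sketch of the setup path as traced (ceph-helpers.sh:131-139); error handling and the asok-dir override are elided:

    setup() {
        local dir=$1
        teardown $dir                        # wipe any leftovers first
        mkdir -p $dir
        mkdir -p $(get_asok_dir)             # pid-suffixed, e.g. /tmp/ceph-asok.42776 above
        if [ $(ulimit -n) -le 1024 ]; then
            ulimit -n 4096                   # raise the fd limit from the default 1024
        fi
        trap "teardown $dir 1" TERM HUP INT  # dump logs if interrupted
    }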
2026-03-08T22:47:16.071 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:47:16.071 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:47:16.071 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:47:16.072 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:16.072 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:47:16.072 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:47:16.072 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:16.073 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:47:16.074 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:47:16.074 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-backfill-recovery-log 2026-03-08T22:47:16.074 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:47:16.074 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.074 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:16.074 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.42776 2026-03-08T22:47:16.075 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:47:16.075 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:47:16.075 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-backfill-recovery-log 2026-03-08T22:47:16.076 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:47:16.076 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.076 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:16.076 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.42776 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-backfill-recovery-log 1' TERM HUP INT 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:34: run: TEST_backfill_log_1 td/osd-backfill-recovery-log 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:106: TEST_backfill_log_1: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:108: TEST_backfill_log_1: _common_test td/osd-backfill-recovery-log '--osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2 8 150 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:41: _common_test: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:42: _common_test: local 'extra_opts=--osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:43: _common_test: local loglen=2 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:44: _common_test: local dupslen=8 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:45: _common_test: local objects=150 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:46: _common_test: local moreobjects=0 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:48: _common_test: local OSDS=6 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:50: _common_test: run_mon td/osd-backfill-recovery-log a 2026-03-08T22:47:16.077 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-backfill-recovery-log/a 2026-03-08T22:47:16.077 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-backfill-recovery-log/a --run-dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:16.121 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-backfill-recovery-log/a '--log-file=td/osd-backfill-recovery-log/$name.log' '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --mon-cluster-log-file=td/osd-backfill-recovery-log/log --run-dir=td/osd-backfill-recovery-log '--pid-file=td/osd-backfill-recovery-log/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:47:16.152 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:47:16.152 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:47:16.152 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:47:16.152 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:47:16.152 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:47:16.153 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:47:16.153 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:47:16.153 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:47:16.153 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:47:16.156 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:47:16.156 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.157 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:16.157 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.42776/ceph-mon.a.asok 2026-03-08T22:47:16.157 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:47:16.157 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.42776/ceph-mon.a.asok config get fsid 2026-03-08T22:47:16.226 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:47:16.226 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:47:16.226 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:47:16.226 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:47:16.226 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:47:16.226 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:47:16.226 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:47:16.226 
INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:47:16.227 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:47:16.227 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.227 INFO:tasks.workunit.client.0.vm06.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:16.227 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.42776/ceph-mon.a.asok 2026-03-08T22:47:16.227 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:47:16.227 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.42776/ceph-mon.a.asok config get mon_host 2026-03-08T22:47:16.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:51: _common_test: run_mgr td/osd-backfill-recovery-log x 2026-03-08T22:47:16.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:47:16.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:47:16.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:47:16.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-backfill-recovery-log/x 2026-03-08T22:47:16.296 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:47:16.430 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:47:16.430 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:16.430 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:16.430 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:16.430 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.430 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:16.431 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:16.431 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:47:16.431 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-backfill-recovery-log/x '--log-file=td/osd-backfill-recovery-log/$name.log' '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --run-dir=td/osd-backfill-recovery-log '--pid-file=td/osd-backfill-recovery-log/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:47:16.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:52: _common_test: export CEPH_ARGS 2026-03-08T22:47:16.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:53: _common_test: export 'EXTRA_OPTS= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:16.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:53: _common_test: EXTRA_OPTS=' --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:16.449 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: _common_test: expr 6 - 1 2026-03-08T22:47:16.450 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: _common_test: seq 0 5 2026-03-08T22:47:16.450 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: _common_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:57: _common_test: run_osd td/osd-backfill-recovery-log 0 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-recovery-log/0 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 
'ceph_args=--fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-recovery-log/0' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-recovery-log/0/journal' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-recovery-log' 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:16.451 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 
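Note the quoting discipline in the run_osd argument assembly just traced: --admin-socket, --log-file and --pid-file are appended inside single quotes precisely so that $cluster and $name reach the daemon unexpanded. They are Ceph metavariables, substituted by the daemon itself at startup, not shell variables; for osd.0 in the default "ceph" cluster the two paths above resolve to:

    # what osd.0 actually opens once Ceph expands its metavariables
    /tmp/ceph-asok.42776/ceph-osd.0.asok     # from $cluster-$name.asok
    td/osd-backfill-recovery-log/osd.0.log   # from $name.log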
2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-recovery-log/$name.log' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-recovery-log/$name.pid' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:47:16.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-recovery-log/0 2026-03-08T22:47:16.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:16.454 INFO:tasks.workunit.client.0.vm06.stdout:add osd0 778c2ec1-3b81-4c2c-9d42-433fb7a89085 2026-03-08T22:47:16.454 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=778c2ec1-3b81-4c2c-9d42-433fb7a89085 2026-03-08T22:47:16.454 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 778c2ec1-3b81-4c2c-9d42-433fb7a89085' 2026-03-08T22:47:16.454 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:16.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB0/K1pc9K7GxAAtG9CLbocliVin/axgPA++A== 2026-03-08T22:47:16.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB0/K1pc9K7GxAAtG9CLbocliVin/axgPA++A=="}' 2026-03-08T22:47:16.469 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 778c2ec1-3b81-4c2c-9d42-433fb7a89085 -i td/osd-backfill-recovery-log/0/new.json 2026-03-08T22:47:16.589 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:47:16.603 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-recovery-log/0/new.json 2026-03-08T22:47:16.604 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/0 --osd-journal=td/osd-backfill-recovery-log/0/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB0/K1pc9K7GxAAtG9CLbocliVin/axgPA++A== --osd-uuid 778c2ec1-3b81-4c2c-9d42-433fb7a89085 2026-03-08T22:47:16.624 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:16.617+0000 7f59c39e58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:16.626 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:16.621+0000 7f59c39e58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:16.628 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:16.621+0000 7f59c39e58c0 -1 WARNING: all dangerous and experimental features are enabled. 
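Condensing the bootstrap just traced (ceph-helpers.sh:662-668): a fresh UUID and cephx secret are generated, the monitor registers the OSD and prints its assigned id (the lone `0` on stdout), and `ceph-osd --mkfs` then formats the data directory with the same argument string the daemon will later run with. A minimal sketch; xtrace does not show redirections, so the write into new.json is inferred:

    # Condensed from the mkfs bootstrap traced at ceph-helpers.sh:662-668.
    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > $osd_data/new.json
    id=$(ceph osd new $uuid -i $osd_data/new.json)   # mon prints the new id
    rm $osd_data/new.json
    ceph-osd -i $id $ceph_args --mkfs --key $OSD_SECRET --osd-uuid $uuid

The repeated "all dangerous and experimental features are enabled" warnings are the expected side effect of the `--enable-experimental-unrecoverable-data-corrupting-features=*` flag in `ceph_args`; every ceph binary invoked with it prints them.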
2026-03-08T22:47:16.628 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:16.621+0000 7f59c39e58c0 -1 bdev(0x564fe0feec00 td/osd-backfill-recovery-log/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:16.628 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:16.621+0000 7f59c39e58c0 -1 bluestore(td/osd-backfill-recovery-log/0) _read_fsid unparsable uuid 2026-03-08T22:47:18.895 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-recovery-log/0/keyring 2026-03-08T22:47:18.895 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:18.896 INFO:tasks.workunit.client.0.vm06.stdout:adding osd0 key to auth repository 2026-03-08T22:47:18.896 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:47:18.896 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-recovery-log/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:19.034 INFO:tasks.workunit.client.0.vm06.stdout:start osd.0 2026-03-08T22:47:19.034 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:47:19.034 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/0 --osd-journal=td/osd-backfill-recovery-log/0/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:47:19.034 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:19.035 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:19.036 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:19.050 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:19.045+0000 7f4c750758c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:19.054 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:19.049+0000 7f4c750758c0 -1 WARNING: all dangerous and experimental features are enabled. 
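Two error-level lines appear here and recur verbatim for every OSD in this run: `bdev(... block) open stat got: (1) Operation not permitted` and `bluestore(...) _read_fsid unparsable uuid`. They are emitted while `--mkfs` probes an empty data directory for a block device and an fsid that do not exist yet; they are not fatal in this run, since each OSD goes on to format, start, and reach "up". After the format, run_osd starts the daemon without `--mkfs` and checks the cluster flags before waiting for the OSD to come up. Roughly, per the trace at ceph-helpers.sh:678-684 (the exact conditional wiring is an assumption; the trace only shows the success path):

    # Start the daemon, then wait for "up" only if the noup flag is not set.
    ceph-osd -i $id $ceph_args
    if ! ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'; then
        wait_for_osd up $id || return 1
    fi

The `jq '.flags_set[]'` output keeps the JSON string quotes, which is why the guard greps for `'"noup"'` with the quotes included.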
2026-03-08T22:47:19.055 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:19.049+0000 7f4c750758c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:19.269 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:19.483 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:19.509 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:19.505+0000 7f4c750758c0 -1 Falling back to public interface 2026-03-08T22:47:20.482 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:20.477+0000 7f4c750758c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:47:20.484 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:47:20.484 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:20.484 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:20.484 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:20.484 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:20.488 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:20.706 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:21.707 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:47:21.707 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:21.707 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:47:21.707 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:21.707 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:21.707 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:21.946 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:22.164 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:22.161+0000 7f4c7082e640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:47:22.947 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:47:22.947 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:22.947 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:22.947 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:22.947 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:22.947 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:23.156 INFO:tasks.workunit.client.0.vm06.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2334881669,v1:127.0.0.1:6803/2334881669] [v2:127.0.0.1:6804/2334881669,v1:127.0.0.1:6805/2334881669] exists,up 778c2ec1-3b81-4c2c-9d42-433fb7a89085 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: _common_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:57: _common_test: run_osd td/osd-backfill-recovery-log 1 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: 
run_osd: shift 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-recovery-log/1 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-recovery-log/1' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-recovery-log/1/journal' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:23.157 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-recovery-log' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' 
--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-recovery-log/$name.log' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-recovery-log/$name.pid' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:47:23.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-recovery-log/1 2026-03-08T22:47:23.159 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:23.160 INFO:tasks.workunit.client.0.vm06.stdout:add osd1 d6efce08-10ce-46e9-aeef-529eb5ca8a03 2026-03-08T22:47:23.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=d6efce08-10ce-46e9-aeef-529eb5ca8a03 2026-03-08T22:47:23.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 d6efce08-10ce-46e9-aeef-529eb5ca8a03' 2026-03-08T22:47:23.160 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:23.172 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB7/K1pkfM8ChAA66lGxGhTIJeSxgT0MMiG7Q== 2026-03-08T22:47:23.172 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": 
"AQB7/K1pkfM8ChAA66lGxGhTIJeSxgT0MMiG7Q=="}' 2026-03-08T22:47:23.172 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new d6efce08-10ce-46e9-aeef-529eb5ca8a03 -i td/osd-backfill-recovery-log/1/new.json 2026-03-08T22:47:23.377 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:47:23.388 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-recovery-log/1/new.json 2026-03-08T22:47:23.389 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/1 --osd-journal=td/osd-backfill-recovery-log/1/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB7/K1pkfM8ChAA66lGxGhTIJeSxgT0MMiG7Q== --osd-uuid d6efce08-10ce-46e9-aeef-529eb5ca8a03 2026-03-08T22:47:23.406 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:23.401+0000 7f70222638c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:23.408 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:23.401+0000 7f70222638c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:23.409 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:23.405+0000 7f70222638c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:23.409 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:23.405+0000 7f70222638c0 -1 bdev(0x55bf2e413c00 td/osd-backfill-recovery-log/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:23.409 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:23.405+0000 7f70222638c0 -1 bluestore(td/osd-backfill-recovery-log/1) _read_fsid unparsable uuid 2026-03-08T22:47:26.437 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-recovery-log/1/keyring 2026-03-08T22:47:26.437 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:26.437 INFO:tasks.workunit.client.0.vm06.stdout:adding osd1 key to auth repository 2026-03-08T22:47:26.437 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:47:26.438 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-recovery-log/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:26.721 INFO:tasks.workunit.client.0.vm06.stdout:start osd.1 2026-03-08T22:47:26.721 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:47:26.721 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/1 --osd-journal=td/osd-backfill-recovery-log/1/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:47:26.721 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:26.722 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:26.726 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:26.738 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:26.733+0000 7fa8db5e38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:26.738 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:26.733+0000 7fa8db5e38c0 -1 WARNING: all dangerous and experimental features are enabled. 
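osd.1 now enters the same `wait_for_osd` polling loop already traced for osd.0. Reconstructed from the trace (ceph-helpers.sh:978-991), it is a bounded poll: up to 300 one-second attempts grepping `ceph osd dump` for the target state:

    # wait_for_osd, reconstructed from the xtrace above.
    wait_for_osd() {
        local state=$1
        local id=$2
        status=1
        for ((i = 0; i < 300; i++)); do
            echo $i                          # the 0,1,2,... seen on stdout
            if ceph osd dump | grep "osd.$id $state"; then
                status=0
                break
            fi
            sleep 1
        done
        return $status
    }

In this run each OSD needs only a handful of iterations; the "waiting for initial osdmap" and "Falling back to public interface" stderr lines interleaved with the loop are the daemon booting in parallel, not part of the wait itself.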
2026-03-08T22:47:26.740 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:26.733+0000 7fa8db5e38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:26.950 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:26.951 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:27.174 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:27.453 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:27.449+0000 7fa8db5e38c0 -1 Falling back to public interface 2026-03-08T22:47:28.175 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:47:28.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:28.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:28.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:28.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:28.175 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:28.394 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:28.419 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:28.413+0000 7fa8db5e38c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:47:29.395 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:47:29.395 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:29.395 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:47:29.395 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:29.395 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:29.395 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:29.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:30.139 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:30.133+0000 7fa8d6d9c640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T22:47:30.618 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:47:30.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:30.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:30.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:30.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:30.619 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3149655235,v1:127.0.0.1:6811/3149655235] [v2:127.0.0.1:6812/3149655235,v1:127.0.0.1:6813/3149655235] exists,up d6efce08-10ce-46e9-aeef-529eb5ca8a03 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: _common_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:57: _common_test: run_osd td/osd-backfill-recovery-log 2 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:47:30.866 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: 
run_osd: shift 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-recovery-log/2 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-recovery-log/2' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-recovery-log/2/journal' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-recovery-log' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:30.867 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' 
--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-recovery-log/$name.log' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-recovery-log/$name.pid' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:47:30.868 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-recovery-log/2 2026-03-08T22:47:30.869 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:30.870 INFO:tasks.workunit.client.0.vm06.stdout:add osd2 ceac4932-41ee-42a3-93c5-e9b3b7816782 2026-03-08T22:47:30.870 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ceac4932-41ee-42a3-93c5-e9b3b7816782 2026-03-08T22:47:30.870 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 ceac4932-41ee-42a3-93c5-e9b3b7816782' 2026-03-08T22:47:30.870 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:30.883 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCC/K1pRfSeNBAA50XWke2t8G5KWMWVlYzjfw== 2026-03-08T22:47:30.883 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": 
"AQCC/K1pRfSeNBAA50XWke2t8G5KWMWVlYzjfw=="}' 2026-03-08T22:47:30.883 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ceac4932-41ee-42a3-93c5-e9b3b7816782 -i td/osd-backfill-recovery-log/2/new.json 2026-03-08T22:47:31.099 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:47:31.113 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-recovery-log/2/new.json 2026-03-08T22:47:31.114 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/2 --osd-journal=td/osd-backfill-recovery-log/2/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCC/K1pRfSeNBAA50XWke2t8G5KWMWVlYzjfw== --osd-uuid ceac4932-41ee-42a3-93c5-e9b3b7816782 2026-03-08T22:47:31.131 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:31.125+0000 7fda5390b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:31.133 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:31.129+0000 7fda5390b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:31.134 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:31.129+0000 7fda5390b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:31.134 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:31.129+0000 7fda5390b8c0 -1 bdev(0x555e85acdc00 td/osd-backfill-recovery-log/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:31.134 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:31.129+0000 7fda5390b8c0 -1 bluestore(td/osd-backfill-recovery-log/2) _read_fsid unparsable uuid 2026-03-08T22:47:33.429 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-recovery-log/2/keyring 2026-03-08T22:47:33.429 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:33.430 INFO:tasks.workunit.client.0.vm06.stdout:adding osd2 key to auth repository 2026-03-08T22:47:33.430 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:47:33.430 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-recovery-log/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:33.712 INFO:tasks.workunit.client.0.vm06.stdout:start osd.2 2026-03-08T22:47:33.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:47:33.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/2 --osd-journal=td/osd-backfill-recovery-log/2/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:47:33.712 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:33.713 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:33.716 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:33.729 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:33.725+0000 7fdd7fe078c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:33.733 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:33.729+0000 7fdd7fe078c0 -1 WARNING: all dangerous and experimental features are enabled. 
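The key-injection step traced here (ceph-helpers.sh:670-676) is the same for every OSD: `--mkfs` already wrote the keyring into the data directory using the `--key` passed on the command line, so run_osd only has to register it with the monitor under the standard OSD capability profile:

    # Key injection as traced: register the mkfs-written keyring with the mon.
    key_fn=$osd_data/keyring
    ceph -i $key_fn auth add osd.$id \
        osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'

The bare `cat` at line 671 is reading that keyring; xtrace shows the command but not its file argument's redirection context.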
2026-03-08T22:47:33.735 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:33.729+0000 7fdd7fe078c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:33.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:34.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:35.159 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:47:35.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:35.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:35.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:35.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:35.160 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:35.181 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:35.177+0000 7fdd7fe078c0 -1 Falling back to public interface 2026-03-08T22:47:35.380 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:35.939 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:35.933+0000 7fdd7fe078c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:47:36.381 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:47:36.381 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:36.381 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:47:36.381 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:36.381 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:36.381 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:36.643 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:37.644 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:47:37.644 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:37.644 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:37.644 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:37.644 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:37.645 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:37.871 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:38.872 INFO:tasks.workunit.client.0.vm06.stdout:4 2026-03-08T22:47:38.873 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:38.873 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:38.873 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:47:38.873 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:38.873 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:39.129 INFO:tasks.workunit.client.0.vm06.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/329636669,v1:127.0.0.1:6819/329636669] [v2:127.0.0.1:6820/329636669,v1:127.0.0.1:6821/329636669] exists,up ceac4932-41ee-42a3-93c5-e9b3b7816782 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: 
_common_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:57: _common_test: run_osd td/osd-backfill-recovery-log 3 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-recovery-log/3 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-recovery-log/3' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-recovery-log/3/journal' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-recovery-log' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
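The `osd.N up in weight 1 up_from E ...` stdout lines that terminate each wait are osdmap entries: `up`/`in` are the liveness and data-placement states, `weight 1` means fully "in", `up_from` is the map epoch at which the OSD was marked up (5, 10, 15 here; each boot consumes a few epochs), `up_thru 0` means no interval has been confirmed alive yet, and the two bracketed v2/v1 address vectors are the public (client-facing) and cluster (back-side) endpoints. The same entry can be read structurally; field names below are as emitted by the squid-era JSON dump and should be verified against your build before relying on them:

    # A structured view of the osdmap entry grep'd by wait_for_osd.
    ceph osd dump --format=json | \
        jq '.osds[] | select(.osd == 2) | {up, in, up_from, up_thru, uuid}'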
2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:39.130 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-recovery-log/$name.log' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-recovery-log/$name.pid' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:47:39.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-recovery-log/3 2026-03-08T22:47:39.132 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:39.133 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=5aea30bf-b818-499d-86df-2ab6e020b871 
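The asok plumbing stepped through above (ceph-helpers.sh:108-120) is small enough to reconstruct. A sketch from the trace follows; the override-variable names are guessed, since the xtrace only ever shows them unset ('[' -n '' ']'):

    # Sketch of the helpers traced at ceph-helpers.sh:108-120. The override
    # variable names below are assumptions; the trace only shows them empty.
    get_asok_dir() {
        if [ -n "$CEPH_ASOK_DIR" ]; then
            echo "$CEPH_ASOK_DIR"
        else
            echo "/tmp/ceph-asok.$$"   # trace: /tmp/ceph-asok.42776, a per-run tmp dir
        fi
    }

    get_asok_path() {
        local name=$1
        if [ -n "$name" ]; then
            echo "$(get_asok_dir)/ceph-$name.asok"   # assumed branch; not exercised here
        else
            # called with no name: leave $cluster/$name literal for the daemon to expand
            echo "$(get_asok_dir)/\$cluster-\$name.asok"
        fi
    }

Note how the unexpanded '$cluster-$name.asok' is deliberately kept inside single quotes on its way into --admin-socket, so each daemon substitutes its own name at runtime.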
2026-03-08T22:47:39.133 INFO:tasks.workunit.client.0.vm06.stdout:add osd3 5aea30bf-b818-499d-86df-2ab6e020b871 2026-03-08T22:47:39.133 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 5aea30bf-b818-499d-86df-2ab6e020b871' 2026-03-08T22:47:39.133 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:39.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCL/K1pkK6dCBAA0SC0a8eS2omI+rk+vCAv1g== 2026-03-08T22:47:39.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCL/K1pkK6dCBAA0SC0a8eS2omI+rk+vCAv1g=="}' 2026-03-08T22:47:39.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 5aea30bf-b818-499d-86df-2ab6e020b871 -i td/osd-backfill-recovery-log/3/new.json 2026-03-08T22:47:39.386 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:47:39.398 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-recovery-log/3/new.json 2026-03-08T22:47:39.399 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/3 --osd-journal=td/osd-backfill-recovery-log/3/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCL/K1pkK6dCBAA0SC0a8eS2omI+rk+vCAv1g== --osd-uuid 5aea30bf-b818-499d-86df-2ab6e020b871 2026-03-08T22:47:39.416 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:39.409+0000 7f6080daa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:39.417 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:39.413+0000 7f6080daa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:39.418 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:39.413+0000 7f6080daa8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:39.418 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:39.413+0000 7f6080daa8c0 -1 bdev(0x559316dbdc00 td/osd-backfill-recovery-log/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:39.419 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:39.413+0000 7f6080daa8c0 -1 bluestore(td/osd-backfill-recovery-log/3) _read_fsid unparsable uuid 2026-03-08T22:47:41.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-recovery-log/3/keyring 2026-03-08T22:47:41.937 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:41.937 INFO:tasks.workunit.client.0.vm06.stdout:adding osd3 key to auth repository 2026-03-08T22:47:41.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:47:41.938 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-recovery-log/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:42.215 INFO:tasks.workunit.client.0.vm06.stdout:start osd.3 2026-03-08T22:47:42.216 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:47:42.216 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/3 --osd-journal=td/osd-backfill-recovery-log/3/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:47:42.216 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:42.216 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:42.219 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:42.233 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:42.229+0000 7f42d11058c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:42.234 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:42.229+0000 7f42d11058c0 -1 WARNING: all dangerous and experimental features are enabled. 
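Condensed from the run_osd trace above (ceph-helpers.sh:662-684), the per-OSD provisioning sequence boils down to the sketch below. Every command appears verbatim in the xtrace; the keyring write is inferred, since the trace only shows a bare 'cat':

    # Sketch of run_osd's provisioning steps as traced for osd.3.
    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > $osd_data/new.json
    ceph osd new $uuid -i $osd_data/new.json        # mon allocates the id (prints "3")
    rm $osd_data/new.json
    ceph-osd -i $id $ceph_args \
        --mkfs --key $OSD_SECRET --osd-uuid $uuid   # format the object store
    printf '[osd.%s]\n\tkey = %s\n' "$id" "$OSD_SECRET" \
        > $osd_data/keyring                         # inferred; trace shows only 'cat'
    ceph -i $osd_data/keyring auth add osd.$id \
        osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
    ceph-osd -i $id $ceph_args                      # start the daemon for real
    wait_for_osd up $id

The bdev "Operation not permitted" and "_read_fsid unparsable uuid" lines right after --mkfs are first-boot probes of a not-yet-formatted data directory; as the log shows, the run continues past them to auth add and start.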
2026-03-08T22:47:42.236 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:42.229+0000 7f42d11058c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:42.443 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:47:42.665 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:42.705 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:42.701+0000 7f42d11058c0 -1 Falling back to public interface 2026-03-08T22:47:43.666 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:47:43.667 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:43.667 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:43.667 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:43.667 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:43.667 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:47:43.892 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:44.177 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:44.173+0000 7f42d11058c0 -1 osd.3 0 log_to_monitors true 2026-03-08T22:47:44.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:44.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:44.893 
INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:47:44.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:44.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:44.893 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:47:45.154 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:46.158 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:47:46.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:46.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:46.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:46.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:46.158 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:47:46.365 INFO:tasks.workunit.client.0.vm06.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3996478769,v1:127.0.0.1:6827/3996478769] [v2:127.0.0.1:6828/3996478769,v1:127.0.0.1:6829/3996478769] exists,up 5aea30bf-b818-499d-86df-2ab6e020b871 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: _common_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:57: _common_test: run_osd td/osd-backfill-recovery-log 4 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:47:46.366 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-recovery-log/4 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-recovery-log/4' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-recovery-log/4/journal' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-recovery-log' 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:46.366 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:46.367 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-recovery-log/$name.log' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-recovery-log/$name.pid' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-recovery-log/4 2026-03-08T22:47:46.367 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:46.368 INFO:tasks.workunit.client.0.vm06.stdout:add osd4 9faa8aa0-b564-45e0-9975-e0e5397aed7f 2026-03-08T22:47:46.368 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9faa8aa0-b564-45e0-9975-e0e5397aed7f 2026-03-08T22:47:46.368 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 9faa8aa0-b564-45e0-9975-e0e5397aed7f' 2026-03-08T22:47:46.368 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:46.380 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCS/K1pfvyiFhAAJfCyf81h3ofUlDl+wv9rvA== 2026-03-08T22:47:46.380 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCS/K1pfvyiFhAAJfCyf81h3ofUlDl+wv9rvA=="}' 2026-03-08T22:47:46.380 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9faa8aa0-b564-45e0-9975-e0e5397aed7f -i td/osd-backfill-recovery-log/4/new.json 2026-03-08T22:47:46.596 INFO:tasks.workunit.client.0.vm06.stdout:4 2026-03-08T22:47:46.607 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-recovery-log/4/new.json 2026-03-08T22:47:46.608 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/4 --osd-journal=td/osd-backfill-recovery-log/4/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCS/K1pfvyiFhAAJfCyf81h3ofUlDl+wv9rvA== --osd-uuid 9faa8aa0-b564-45e0-9975-e0e5397aed7f 2026-03-08T22:47:46.624 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:46.621+0000 7f00348858c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:46.627 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:46.621+0000 7f00348858c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:46.627 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:46.621+0000 7f00348858c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:46.628 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:46.621+0000 7f00348858c0 -1 bdev(0x564e0af81c00 td/osd-backfill-recovery-log/4/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:46.628 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:46.621+0000 7f00348858c0 -1 bluestore(td/osd-backfill-recovery-log/4) _read_fsid unparsable uuid 2026-03-08T22:47:49.629 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-recovery-log/4/keyring 2026-03-08T22:47:49.629 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:49.630 INFO:tasks.workunit.client.0.vm06.stdout:adding osd4 key to auth repository 2026-03-08T22:47:49.630 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T22:47:49.630 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-recovery-log/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:49.915 INFO:tasks.workunit.client.0.vm06.stdout:start osd.4 2026-03-08T22:47:49.915 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T22:47:49.915 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/4 --osd-journal=td/osd-backfill-recovery-log/4/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:47:49.916 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:49.916 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:49.920 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:49.932 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:49.925+0000 7fdd4ae948c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:49.940 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:49.933+0000 7fdd4ae948c0 -1 WARNING: all dangerous and experimental features are enabled. 
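The three pipeline stages logged at ceph-helpers.sh:681 (xtrace prints them right to left) reconstruct to a single noup check. How run_osd branches on the result is not visible in the trace, so the conditional shape below is an assumption; what the log does show is that wait_for_osd up follows when the flag is absent:

    # Sketch of the check traced at ceph-helpers.sh:681; branch shape assumed.
    if ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"' ; then
        :   # noup set: never taken in this run
    else
        wait_for_osd up $id
    fi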
2026-03-08T22:47:49.944 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:49.937+0000 7fdd4ae948c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:50.136 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:47:50.351 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:50.889 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:50.885+0000 7fdd4ae948c0 -1 Falling back to public interface 2026-03-08T22:47:51.352 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:51.352 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:51.352 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:47:51.352 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:51.352 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:51.352 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:47:51.564 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:51.860 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:51.853+0000 7fdd4ae948c0 -1 osd.4 0 log_to_monitors true 2026-03-08T22:47:52.565 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:52.565 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:52.565 
INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:47:52.565 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:52.565 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:52.567 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:47:52.799 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:52.916 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:52.909+0000 7fdd4664d640 -1 osd.4 0 waiting for initial osdmap 2026-03-08T22:47:53.800 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:53.800 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:53.800 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:53.800 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:47:53.800 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:53.801 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:47:54.014 INFO:tasks.workunit.client.0.vm06.stdout:osd.4 up in weight 1 up_from 25 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/2018567070,v1:127.0.0.1:6835/2018567070] [v2:127.0.0.1:6836/2018567070,v1:127.0.0.1:6837/2018567070] exists,up 9faa8aa0-b564-45e0-9975-e0e5397aed7f 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:55: _common_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:57: _common_test: run_osd td/osd-backfill-recovery-log 5 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-backfill-recovery-log 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T22:47:54.015 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-backfill-recovery-log/5 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-backfill-recovery-log/5' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-backfill-recovery-log/5/journal' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-backfill-recovery-log' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776 2026-03-08T22:47:54.015 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:54.016 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-backfill-recovery-log/$name.log' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-backfill-recovery-log/$name.pid' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:47:54.016 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-backfill-recovery-log/5 2026-03-08T22:47:54.017 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:54.017 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4ebc952d-3874-486a-a1c9-a74b9b85107e 2026-03-08T22:47:54.017 INFO:tasks.workunit.client.0.vm06.stdout:add osd5 4ebc952d-3874-486a-a1c9-a74b9b85107e 2026-03-08T22:47:54.017 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 4ebc952d-3874-486a-a1c9-a74b9b85107e' 2026-03-08T22:47:54.017 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:54.031 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCa/K1phnzYARAAS78RTyLH3mEqRhjAxoYpYQ== 2026-03-08T22:47:54.031 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCa/K1phnzYARAAS78RTyLH3mEqRhjAxoYpYQ=="}' 2026-03-08T22:47:54.031 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4ebc952d-3874-486a-a1c9-a74b9b85107e -i td/osd-backfill-recovery-log/5/new.json 2026-03-08T22:47:54.252 INFO:tasks.workunit.client.0.vm06.stdout:5 2026-03-08T22:47:54.263 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-backfill-recovery-log/5/new.json 2026-03-08T22:47:54.264 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/5 --osd-journal=td/osd-backfill-recovery-log/5/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCa/K1phnzYARAAS78RTyLH3mEqRhjAxoYpYQ== --osd-uuid 4ebc952d-3874-486a-a1c9-a74b9b85107e 2026-03-08T22:47:54.280 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:54.277+0000 7fc108f6f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:54.282 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:54.277+0000 7fc108f6f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:54.283 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:54.277+0000 7fc108f6f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:54.283 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:54.277+0000 7fc108f6f8c0 -1 bdev(0x55c125ba9c00 td/osd-backfill-recovery-log/5/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:54.283 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:54.277+0000 7fc108f6f8c0 -1 bluestore(td/osd-backfill-recovery-log/5) _read_fsid unparsable uuid 2026-03-08T22:47:56.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-backfill-recovery-log/5/keyring 2026-03-08T22:47:56.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:56.537 INFO:tasks.workunit.client.0.vm06.stdout:adding osd5 key to auth repository 2026-03-08T22:47:56.538 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T22:47:56.538 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-backfill-recovery-log/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:56.805 INFO:tasks.workunit.client.0.vm06.stdout:start osd.5 2026-03-08T22:47:56.806 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T22:47:56.806 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 5 --fsid=c783216f-f8ee-4c2d-9c44-aa92c26fac3c --auth-supported=none --mon-host=127.0.0.1:7129 --osd_max_backfills=1 --debug_reserver=20 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-backfill-recovery-log/5 --osd-journal=td/osd-backfill-recovery-log/5/journal --chdir= --osd_min_pg_log_entries=1 --osd_max_pg_log_entries=2 --osd_pg_log_dups_tracked=10 --run-dir=td/osd-backfill-recovery-log '--admin-socket=/tmp/ceph-asok.42776/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-backfill-recovery-log/$name.log' '--pid-file=td/osd-backfill-recovery-log/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:47:56.806 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:56.807 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:56.809 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:56.821 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:56.817+0000 7fcb5d9438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:56.829 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:56.825+0000 7fcb5d9438c0 -1 WARNING: all dangerous and experimental features are enabled. 
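The polling loop the trace keeps re-entering (ceph-helpers.sh:978-991) maps line for line onto the sketch below; only the function wrapper and the final return of $status are filled in around what the xtrace shows:

    # wait_for_osd, reconstructed from the traced lines 978-991.
    wait_for_osd() {
        local state=$1   # e.g. "up"
        local id=$2

        status=1
        for ((i=0; i < 300; i++)); do
            echo $i
            if ! ceph osd dump | grep "osd.$id $state" ; then
                sleep 1
            else
                status=0
                break
            fi
        done
        return $status
    }

Each iteration costs a full 'ceph osd dump' round trip to the mon, which is why every OSD in this run takes three to four polls before its "osd.N up" line appears.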
2026-03-08T22:47:56.830 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:56.825+0000 7fcb5d9438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stdout:0 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:57.030 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:47:57.243 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:57.789 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:57.785+0000 7fcb5d9438c0 -1 Falling back to public interface 2026-03-08T22:47:58.244 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:58.244 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:58.244 INFO:tasks.workunit.client.0.vm06.stdout:1 2026-03-08T22:47:58.244 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:58.244 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:58.244 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:47:58.451 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:58.760 INFO:tasks.workunit.client.0.vm06.stderr:2026-03-08T22:47:58.753+0000 7fcb5d9438c0 -1 osd.5 0 log_to_monitors true 2026-03-08T22:47:59.452 INFO:tasks.workunit.client.0.vm06.stdout:2 2026-03-08T22:47:59.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:59.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:47:59.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:59.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:59.452 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:47:59.679 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:00.680 INFO:tasks.workunit.client.0.vm06.stdout:3 2026-03-08T22:48:00.680 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:00.680 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:00.680 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:48:00.680 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:00.680 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:48:00.884 INFO:tasks.workunit.client.0.vm06.stdout:osd.5 up in weight 1 up_from 30 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/3328865812,v1:127.0.0.1:6843/3328865812] [v2:127.0.0.1:6844/3328865812,v1:127.0.0.1:6845/3328865812] exists,up 4ebc952d-3874-486a-a1c9-a74b9b85107e 2026-03-08T22:48:00.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:00.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:00.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:00.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:60: _common_test: create_pool test 1 1 2026-03-08T22:48:00.884 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T22:48:01.132 INFO:tasks.workunit.client.0.vm06.stderr:pool 'test' already exists 2026-03-08T22:48:01.144 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:48:02.145 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:62: _common_test: seq 1 150 2026-03-08T22:48:02.146 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:62: _common_test: for j in $(seq 1 $objects) 2026-03-08T22:48:02.146 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:64: _common_test: rados -p test 
[... xtrace from 2026-03-08T22:48:02.170 through 2026-03-08T22:48:05.562 elided: loop iterations obj-2 through obj-150, each repeating the same two lines (osd-backfill-recovery-log.sh:62: _common_test: for j in $(seq 1 $objects) and osd-backfill-recovery-log.sh:64: _common_test: rados -p test put obj-<j> /etc/passwd) ...]
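
The elided span is the test's data-population step. Per the `seq 1 150` substitution above, $objects is 150, so the loop amounts to the following (a sketch inferred from the xtrace; only the value of $objects is confirmed by the log):

    # Write 150 small objects into the "test" pool so backfill has
    # something to move later; /etc/passwd is just a convenient payload.
    objects=150
    for j in $(seq 1 $objects); do
        rados -p test put obj-$j /etc/passwd
    done
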
2026-03-08T22:48:05.592 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:68: _common_test: ceph pg dump pgs --format=json
2026-03-08T22:48:05.592 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:68: _common_test: jq '.pg_stats[0].up[]'
2026-03-08T22:48:05.796 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs
2026-03-08T22:48:05.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:68: _common_test: ceph osd out
2026-03-08T22:48:05.963 INFO:tasks.workunit.client.0.vm06.stderr:Invalid command: saw 0 of ids()..., expected at least 1
2026-03-08T22:48:05.963 INFO:tasks.workunit.client.0.vm06.stderr:osd out <ids> [<ids>...] : set osd(s) <id> [<id>...] out, or use <any|all> to set all osds out
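
This is the notable failure in this run: `ceph osd out` was invoked with no OSD id at all, so the monitor rejected it with EINVAL. The ids presumably come from the `ceph pg dump pgs --format=json | jq '.pg_stats[0].up[]'` substitution on the same script line, which evidently expanded to nothing; note also that the script's check on the next line ('[' 0 '!=' 0 ']') saw a zero status and let the test press on. A guarded version of what line 68 appears to do (hypothetical reconstruction; the actual source line is not visible in this log):

    # Pick an OSD from the first pg's up set and mark it out. If jq prints
    # nothing, the unquoted substitution expands to zero arguments and
    # "ceph osd out" fails exactly as logged above.
    osd=$(ceph pg dump pgs --format=json | jq '.pg_stats[0].up[]' | head -1)
    [ -n "$osd" ] || { echo "no up OSD found in pg_stats[0]" >&2; exit 1; }
    ceph osd out "$osd"
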
2026-03-08T22:48:05.967 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:69: _common_test: '[' 0 '!=' 0 ']' 2026-03-08T22:48:05.967 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:75: _common_test: sleep 1 2026-03-08T22:48:06.968 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:76: _common_test: wait_for_clean 2026-03-08T22:48:06.968 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:48:06.968 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:48:06.968 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:48:06.968 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:48:06.968 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:48:06.968 INFO:tasks.workunit.client.0.vm06.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:48:06.969 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:48:06.969 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:48:06.969 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:48:07.022 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:48:07.023 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:48:07.023 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
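The delays array just printed is get_timeout_delays 90 .1 unrolled: the wait doubles from 0.1 s up to a 15 s ceiling, the ceiling repeats, and a final fractional wait tops the series up to the 90 s budget (0.1+0.2+0.4+0.8+1.6+3.2+6.4+12.8 = 25.5; four 15 s waits reach 85.5; the closing 4.5 lands exactly on 90). A sketch that reproduces the printed array; the 15 s cap is read off the output, and the real helper in ceph-helpers.sh may derive it differently:

    # Re-derive the backoff schedule seen above: delays double from $step,
    # cap at 15 s, and a final remainder makes the series sum to $timeout.
    get_delays_sketch() {
        local timeout=$1 step=$2 cap=15 total=0
        local -a delays=()
        while awk -v t="$total" -v s="$step" -v T="$timeout" 'BEGIN{exit !(t+s<T)}'; do
            delays+=("$step")
            total=$(awk -v t="$total" -v s="$step" 'BEGIN{print t+s}')
            step=$(awk -v s="$step" -v c="$cap" 'BEGIN{d=s*2; print (d>c)?c:d}')
        done
        delays+=("$(awk -v t="$total" -v T="$timeout" 'BEGIN{print T-t}')")
        echo "${delays[@]}"
    }
    get_delays_sketch 90 .1   # -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5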
2026-03-08T22:48:07.023 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:48:07.023 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:48:07.023 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:48:07.230 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:48:07.230
INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:48:07.230 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:48:07.231 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:48:07.231 INFO:tasks.workunit.client.0.vm06.stderr:4 2026-03-08T22:48:07.231 INFO:tasks.workunit.client.0.vm06.stderr:5' 2026-03-08T22:48:07.231 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:48:07.231 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.231 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:48:07.306 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836491 2026-03-08T22:48:07.306 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836491 2026-03-08T22:48:07.306 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491' 2026-03-08T22:48:07.306 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.306 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:48:07.383 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970 2026-03-08T22:48:07.383 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970 2026-03-08T22:48:07.383 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970' 2026-03-08T22:48:07.383 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.384 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:48:07.460 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509448 2026-03-08T22:48:07.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509448 2026-03-08T22:48:07.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970 2-64424509448' 2026-03-08T22:48:07.461 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.461 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:48:07.539
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345927 2026-03-08T22:48:07.540 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345927 2026-03-08T22:48:07.540 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970 2-64424509448 3-85899345927' 2026-03-08T22:48:07.540 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.540 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:48:07.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182405 2026-03-08T22:48:07.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182405 2026-03-08T22:48:07.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970 2-64424509448 3-85899345927 4-107374182405' 2026-03-08T22:48:07.617 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.617 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:48:07.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018884 2026-03-08T22:48:07.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018884 2026-03-08T22:48:07.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970 2-64424509448 3-85899345927 4-107374182405 5-128849018884' 2026-03-08T22:48:07.693 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:07.693 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836491 2026-03-08T22:48:07.693 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:07.694 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:48:07.695 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836491 2026-03-08T22:48:07.695 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:07.695 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836491 2026-03-08T22:48:07.695 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836491 2026-03-08T22:48:07.695 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836491' 2026-03-08T22:48:07.696 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:07.899 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836491 2026-03-08T22:48:07.899 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:48:08.900 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:48:08.900 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:09.113 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836491 2026-03-08T22:48:09.113 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:09.113 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970 2026-03-08T22:48:09.113 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:09.114 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:09.115 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:09.115 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970 2026-03-08T22:48:09.115 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970 2026-03-08T22:48:09.116 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 42949672970 2026-03-08T22:48:09.116 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970' 2026-03-08T22:48:09.116 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:48:09.325 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672970 2026-03-08T22:48:09.325 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:09.325 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509448 2026-03-08T22:48:09.325 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:09.326 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:48:09.327 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509448 2026-03-08T22:48:09.327 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:09.327 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509448 2026-03-08T22:48:09.328 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509448' 2026-03-08T22:48:09.328 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 64424509448 2026-03-08T22:48:09.328 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:48:09.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509449 -lt 64424509448 2026-03-08T22:48:09.537 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:09.537 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345927 2026-03-08T22:48:09.537 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:09.538 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:48:09.539 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345927 2026-03-08T22:48:09.539 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:09.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345927 2026-03-08T22:48:09.539 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345927' 2026-03-08T22:48:09.539 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 85899345927 2026-03-08T22:48:09.540 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:48:09.746 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345927 -lt 85899345927 2026-03-08T22:48:09.746 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:09.746 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182405 2026-03-08T22:48:09.746 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:09.747 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:48:09.748 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182405 2026-03-08T22:48:09.748 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:09.748 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182405 2026-03-08T22:48:09.748 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182405' 2026-03-08T22:48:09.749 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 107374182405 2026-03-08T22:48:09.749 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:48:09.957 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182406 -lt 107374182405 2026-03-08T22:48:09.958 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:09.958 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018884 2026-03-08T22:48:09.958 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:09.959 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:48:09.959 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018884 2026-03-08T22:48:09.959 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:09.960 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018884 2026-03-08T22:48:09.960 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.5 seq 128849018884 2026-03-08T22:48:09.960 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018884' 2026-03-08T22:48:09.960 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:48:10.173 
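Each ceph tell osd.N flush_pg_stats above returns a sequence number (the seq= values), and the helper then polls ceph osd last-stat-seq N until the monitor has caught up: for osd.0 the first poll returned 21474836489, still below the requested 21474836491, so it slept a second and the next read (21474836492) passed. A minimal sketch of that wait; the one-second countdown is a stand-in for the helper's own timeout bookkeeping:

    # Block until the mon has ingested the stats flush acked by osd.$osd.
    wait_for_stat_seq() {
        local osd=$1 seq=$2 timeout=${3:-300}
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1
            timeout=$((timeout - 1))
            test "$timeout" -eq 0 && return 1   # give up after the time budget
        done
    }
    seq=$(ceph tell osd.0 flush_pg_stats)   # the tell's output is the sequence number
    wait_for_stat_seq 0 "$seq"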
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018884 -lt 128849018884 2026-03-08T22:48:10.174 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:48:10.174 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:10.174 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:10.453 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:10.658 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:48:10.658 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:10.658 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:10.658 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:10.940 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:48:10.940 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:48:10.940 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:48:10.940 
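With the stats flushed, wait_for_clean compares two jq-derived numbers: the PG total from ceph status (.pgmap.num_pgs) and the count of PGs whose state contains both "active" and "clean" but not "stale". Here both come back 1, so the loop breaks on its first pass and the helper returns 0, as shown just above. The same two queries, runnable standalone (expressions copied from the trace):

    # The two quantities wait_for_clean compares, as standalone commands.
    num_pgs=$(ceph --format json status | jq .pgmap.num_pgs)
    num_active_clean=$(ceph --format json pg dump pgs |
        jq '.pg_stats | [.[] | .state |
            select(contains("active") and contains("clean")) |
            select(contains("stale") | not)] | length')
    test "$num_active_clean" = "$num_pgs" && echo "all PGs are active+clean"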
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:78: _common_test: flush_pg_stats 2026-03-08T22:48:10.940 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:48:10.941 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:1 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:2 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:3 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:4 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:5' 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:11.159 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:48:11.245 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836494 2026-03-08T22:48:11.245 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836494 2026-03-08T22:48:11.245 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494' 2026-03-08T22:48:11.245 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:11.245 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:48:11.326 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672973 2026-03-08T22:48:11.326 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672973 2026-03-08T22:48:11.326 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-42949672973' 2026-03-08T22:48:11.326 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:11.326 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:48:11.412 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509451 2026-03-08T22:48:11.412 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509451 2026-03-08T22:48:11.412 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-42949672973 2-64424509451' 2026-03-08T22:48:11.412 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:11.412 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:48:11.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345930 2026-03-08T22:48:11.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345930 2026-03-08T22:48:11.493 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-42949672973 2-64424509451 3-85899345930' 2026-03-08T22:48:11.494 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:11.494 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:48:11.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182408 2026-03-08T22:48:11.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182408 2026-03-08T22:48:11.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-42949672973 2-64424509451 3-85899345930 4-107374182408' 2026-03-08T22:48:11.579 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:11.579 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:48:11.660 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018887 2026-03-08T22:48:11.660 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018887 2026-03-08T22:48:11.660 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-42949672973 2-64424509451 3-85899345930 4-107374182408 5-128849018887' 2026-03-08T22:48:11.660 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:11.660 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836494 2026-03-08T22:48:11.660 
INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:11.661 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:48:11.662 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836494 2026-03-08T22:48:11.662 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:11.662 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836494 2026-03-08T22:48:11.663 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.0 seq 21474836494 2026-03-08T22:48:11.663 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836494' 2026-03-08T22:48:11.663 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:11.881 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836494 2026-03-08T22:48:11.881 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:48:12.882 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:48:12.883 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:13.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836494 2026-03-08T22:48:13.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:13.131 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672973 2026-03-08T22:48:13.132 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:13.132 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:13.133 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672973 2026-03-08T22:48:13.133 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:13.134 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672973 2026-03-08T22:48:13.134 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.1 seq 42949672973 2026-03-08T22:48:13.134 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672973' 2026-03-08T22:48:13.135 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:48:13.355 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672973 -lt 42949672973 2026-03-08T22:48:13.355 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:13.355 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509451 2026-03-08T22:48:13.355 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:13.356 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:48:13.357 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509451 2026-03-08T22:48:13.357 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:13.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509451 2026-03-08T22:48:13.358 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509451' 2026-03-08T22:48:13.358 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.2 seq 64424509451 2026-03-08T22:48:13.358 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:48:13.585 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509451 -lt 64424509451 2026-03-08T22:48:13.585 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:13.585 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345930 2026-03-08T22:48:13.585 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:13.586 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:48:13.587 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345930 2026-03-08T22:48:13.587 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:13.588 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=85899345930 2026-03-08T22:48:13.588 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.3 seq 85899345930 2026-03-08T22:48:13.588 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345930' 2026-03-08T22:48:13.588 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:48:13.805 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345930 -lt 85899345930 2026-03-08T22:48:13.805 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:13.806 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182408 2026-03-08T22:48:13.806 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:13.806 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:48:13.807 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182408 2026-03-08T22:48:13.807 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:13.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182408 2026-03-08T22:48:13.808 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182408' 2026-03-08T22:48:13.808 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.4 seq 107374182408 2026-03-08T22:48:13.808 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:48:14.021 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182408 -lt 107374182408 2026-03-08T22:48:14.021 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:14.022 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018887 2026-03-08T22:48:14.022 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:14.022 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:48:14.023 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018887 2026-03-08T22:48:14.023 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:14.024 
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018887 2026-03-08T22:48:14.024 INFO:tasks.workunit.client.0.vm06.stdout:waiting osd.5 seq 128849018887 2026-03-08T22:48:14.024 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018887' 2026-03-08T22:48:14.024 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:48:14.233 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018887 -lt 128849018887 2026-03-08T22:48:14.233 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:80: _common_test: ceph pg dump pgs --format=json 2026-03-08T22:48:14.233 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:80: _common_test: jq '.pg_stats[0].up_primary' 2026-03-08T22:48:14.436 INFO:tasks.workunit.client.0.vm06.stderr:dumped pgs 2026-03-08T22:48:14.448 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:80: _common_test: newprimary=1 2026-03-08T22:48:14.448 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:81: _common_test: kill_daemons 2026-03-08T22:48:14.449 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:48:14.449 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:48:14.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:48:14.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:48:14.449 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:83: _common_test: ERRORS=0 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:84: _common_test: _objectstore_tool_nodown td/osd-backfill-recovery-log 1 --no-mon-config --pgid 1.0 --op log 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-backfill-recovery-log 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:48:19.772 
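After recording the new up_primary (newprimary=1), the test stops every daemon with kill_daemons and inspects osd.1's copy of the PG offline: _objectstore_tool_nodown below resolves to a direct ceph-objectstore-tool run against the OSD's data directory, teeing the PG 1.0 log dump into result.log for later checks. The pg_log_t JSON that follows reports head 34'150 and tail 34'100, i.e. the dump carries versions 34'101 through 34'150, matching the first entry shown. The underlying invocation, exactly as traced (the daemons must be down for the tool to take exclusive access to the data path):

    # Offline dump of PG 1.0's log from osd.1's data directory, as traced below.
    ceph-objectstore-tool --data-path td/osd-backfill-recovery-log/1 \
        --no-mon-config --pgid 1.0 --op log |
        tee td/osd-backfill-recovery-log/result.log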
INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-backfill-recovery-log/1 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-backfill-recovery-log/1 --no-mon-config --pgid 1.0 --op log 2026-03-08T22:48:19.772 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:84: _common_test: tee td/osd-backfill-recovery-log/result.log 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout:{ 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "pg_log_t": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "head": "34'150", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "tail": "34'100", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "log": [ 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:be8b0193:::obj-101:head", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'101", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4515.0:1", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.443866+0000", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.523 
INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:6aa5e219:::obj-102:head", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'102", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4518.0:1", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.465045+0000", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.523 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:ff8cd32f:::obj-103:head", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'103", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4521.0:1", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.485969+0000", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.524 
INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:fcaf5c1c:::obj-104:head", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'104", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4524.0:1", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.507438+0000", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:2111aa60:::obj-105:head", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'105", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4527.0:1", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.528545+0000", 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: 
"can_local_rollback": false, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.524 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:9fed4eb7:::obj-106:head", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'106", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4530.0:1", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.549487+0000", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:0b0c0bce:::obj-107:head", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'107", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.525 
INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4533.0:1", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.570065+0000", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:b7ce613a:::obj-108:head", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'108", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.525 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4536.0:1", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.593071+0000", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.526 
INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:4f32cd61:::obj-109:head", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'109", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4539.0:1", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.613918+0000", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:40f97f88:::obj-110:head", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'110", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4542.0:1", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.635092+0000", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: }, 
2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.526 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:1d8e7a62:::obj-111:head", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'111", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4545.0:1", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.656849+0000", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:3a283967:::obj-112:head", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'112", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4548.0:1", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.678940+0000", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.527 
INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:2acdc9e7:::obj-113:head", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'113", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4551.0:1", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.701241+0000", 2026-03-08T22:48:20.527 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:7f18a9ad:::obj-114:head", 2026-03-08T22:48:20.528 
INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'114", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4554.0:1", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.724775+0000", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:852b8a34:::obj-115:head", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'115", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4557.0:1", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.749002+0000", 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.528 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.529 
INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:589de4f7:::obj-116:head", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'116", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4560.0:1", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.771918+0000", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:14f2571c:::obj-117:head", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'117", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4563.0:1", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.794084+0000", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: 
"ops": [] 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.529 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:dd36f0e5:::obj-118:head", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'118", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4566.0:1", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.816088+0000", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:0d7fa432:::obj-119:head", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'119", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4569.0:1", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": 
"2026-03-08T22:48:04.838877+0000", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:295ec31a:::obj-120:head", 2026-03-08T22:48:20.530 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'120", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4572.0:1", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.862239+0000", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "op": 
"modify", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:c2863561:::obj-121:head", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'121", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4575.0:1", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.886318+0000", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:0b943493:::obj-122:head", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'122", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4578.0:1", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.908997+0000", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": 
"[2163~18446744073709549452]", 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.531 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:4cfecbf8:::obj-123:head", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'123", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4581.0:1", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.932705+0000", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:fd18e511:::obj-124:head", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'124", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4584.0:1", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.957739+0000", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.532 
INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:4f299322:::obj-125:head", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'125", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4587.0:1", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:04.984860+0000", 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.532 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:302ecc7e:::obj-126:head", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'126", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4590.0:1", 
2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.008330+0000", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:42feb657:::obj-127:head", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'127", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4593.0:1", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.031879+0000", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.533 
INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:5e6b9a81:::obj-128:head", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'128", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4596.0:1", 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.533 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.054082+0000", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:0d242242:::obj-129:head", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'129", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4599.0:1", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.077205+0000", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: 
"clean_regions": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:ed409294:::obj-130:head", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'130", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4602.0:1", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.099790+0000", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.534 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:720761ff:::obj-131:head", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'131", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4605.0:1", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.122879+0000", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 
2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:7715ee72:::obj-132:head", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'132", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4608.0:1", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.146224+0000", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:580938e3:::obj-133:head", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'133", 
2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4611.0:1", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.169583+0000", 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.535 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: { 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "op": "modify", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "object": "1:05dc98b5:::obj-134:head", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "version": "34'134", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "prior_version": "0'0", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "reqid": "client.4614.0:1", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "extra_reqids": [], 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "mtime": "2026-03-08T22:48:05.190766+0000", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "return_code": 0, 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "mod_desc": { 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "object_mod_desc": { 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "can_local_rollback": false, 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "rollback_info_completed": false, 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "ops": [] 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: } 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: }, 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "clean_regions": { 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "object_clean_regions": { 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "clean_offsets": "[2163~18446744073709549452]", 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "clean_omap": true, 2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout: "new_object": false 
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:73221e51:::obj-135:head",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'135",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4617.0:1",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.214961+0000",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:3a97ede2:::obj-136:head",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'136",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4620.0:1",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.237389+0000",
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.536 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:9e95a205:::obj-137:head",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'137",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4623.0:1",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.259468+0000",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:27f846bd:::obj-138:head",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'138",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4626.0:1",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.282876+0000",
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.537 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:bb2f7605:::obj-139:head",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'139",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4629.0:1",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.305446+0000",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:27668d5f:::obj-140:head",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'140",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4632.0:1",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.327814+0000",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:0d70cb86:::obj-141:head",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'141",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4635.0:1",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.350172+0000",
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.538 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:a7b11c8d:::obj-142:head",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'142",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4638.0:1",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.373902+0000",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:100e877c:::obj-143:head",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'143",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4641.0:1",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.396750+0000",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.539 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:2c544e5a:::obj-144:head",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'144",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4644.0:1",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.419358+0000",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:35b1ddfe:::obj-145:head",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'145",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4647.0:1",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.442051+0000",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:0e8a43a7:::obj-146:head",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'146",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4650.0:1",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.472061+0000",
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.540 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:20e0adce:::obj-147:head",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'147",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4653.0:1",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.501816+0000",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:8d3e6a95:::obj-148:head",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'148",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4656.0:1",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.528348+0000",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:e9c179f9:::obj-149:head",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'149",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4659.0:1",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.555298+0000",
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.541 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "op": "modify",
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "object": "1:cb39fd60:::obj-150:head",
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'150",
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "prior_version": "0'0",
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4662.0:1",
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "extra_reqids": [],
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "mtime": "2026-03-08T22:48:05.581250+0000",
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": 0,
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                "mod_desc": {
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_mod_desc": {
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                        "can_local_rollback": false,
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                        "rollback_info_completed": false,
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                        "ops": []
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:20.542 INFO:tasks.workunit.client.0.vm06.stdout:                },
2026-03-08T22:48:21.054 INFO:tasks.workunit.client.0.vm06.stdout:                "clean_regions": {
2026-03-08T22:48:21.054 INFO:tasks.workunit.client.0.vm06.stdout:                    "object_clean_regions": {
2026-03-08T22:48:21.054 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_offsets": "[2163~18446744073709549452]",
2026-03-08T22:48:21.054 INFO:tasks.workunit.client.0.vm06.stdout:                        "clean_omap": true,
2026-03-08T22:48:21.054 INFO:tasks.workunit.client.0.vm06.stdout:                        "new_object": false
2026-03-08T22:48:21.054 INFO:tasks.workunit.client.0.vm06.stdout:                    }
2026-03-08T22:48:21.054 INFO:tasks.workunit.client.0.vm06.stdout:                }
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            }
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:        ],
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:        "dups": [
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4494.0:1",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'94",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "user_version": "94",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": "0"
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4497.0:1",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'95",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "user_version": "95",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": "0"
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4500.0:1",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'96",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "user_version": "96",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": "0"
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4503.0:1",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'97",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "user_version": "97",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": "0"
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4506.0:1",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'98",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "user_version": "98",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": "0"
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4509.0:1",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'99",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "user_version": "99",
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": "0"
2026-03-08T22:48:21.055 INFO:tasks.workunit.client.0.vm06.stdout:            },
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:            {
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:                "reqid": "client.4512.0:1",
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:                "version": "34'100",
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:                "user_version": "100",
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:                "return_code": "0"
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:            }
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:        ]
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:    },
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:    "pg_missing_t": {
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:        "missing": [],
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:        "may_include_deletes": true
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:    }
2026-03-08T22:48:21.056 INFO:tasks.workunit.client.0.vm06.stdout:}
2026-03-08T22:48:21.058 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:85: _common_test: jq '.pg_log_t.log | length' td/osd-backfill-recovery-log/result.log
2026-03-08T22:48:21.073 INFO:tasks.workunit.client.0.vm06.stdout:FAILED: Wrong log length got 50 (expected 2)
2026-03-08T22:48:21.074 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:85: _common_test: LOGLEN=50
2026-03-08T22:48:21.074 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:86: _common_test: '[' 50 '!=' 2 ']'
2026-03-08T22:48:21.074 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:87: _common_test: echo 'FAILED: Wrong log length got 50 (expected 2)'
2026-03-08T22:48:21.074 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:88: _common_test: expr 0 + 1
2026-03-08T22:48:21.074 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:88: _common_test: ERRORS=1
2026-03-08T22:48:21.075 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:90: _common_test: jq '.pg_log_t.dups | length' td/osd-backfill-recovery-log/result.log
2026-03-08T22:48:21.086 INFO:tasks.workunit.client.0.vm06.stdout:FAILED: Wrong dups length got 7 (expected 8)
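The two FAILED lines above are the heart of this job's failure: osd-backfill-recovery-log.sh dumps the PG log into td/osd-backfill-recovery-log/result.log and counts entries with jq, expecting a trimmed log (2 entries) and 8 dup entries but finding 50 and 7. A minimal sketch of the same bookkeeping, reconstructed from the xtrace lines in this log (file names and expected values are taken from the trace; this is not the full test):

    #!/usr/bin/env bash
    # Re-create the length assertions from _common_test (trace lines 85-99).
    result=td/osd-backfill-recovery-log/result.log
    ERRORS=0

    LOGLEN=$(jq '.pg_log_t.log | length' "$result")
    if [ "$LOGLEN" != "2" ]; then
        echo "FAILED: Wrong log length got $LOGLEN (expected 2)"
        ERRORS=$(expr $ERRORS + 1)
    fi

    DUPSLEN=$(jq '.pg_log_t.dups | length' "$result")
    if [ "$DUPSLEN" != "8" ]; then
        echo "FAILED: Wrong dups length got $DUPSLEN (expected 8)"
        ERRORS=$(expr $ERRORS + 1)
    fi

    if [ "$ERRORS" != "0" ]; then
        echo "TEST FAILED"
        exit 1
    fi

The xtrace that follows shows exactly this arithmetic playing out: ERRORS is bumped once per failed check, reaching 2 before the teardown runs.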
2026-03-08T22:48:21.086 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:90: _common_test: DUPSLEN=7
2026-03-08T22:48:21.086 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:91: _common_test: '[' 7 '!=' 8 ']'
2026-03-08T22:48:21.087 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:92: _common_test: echo 'FAILED: Wrong dups length got 7 (expected 8)'
2026-03-08T22:48:21.087 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:93: _common_test: expr 1 + 1
2026-03-08T22:48:21.087 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:93: _common_test: ERRORS=2
2026-03-08T22:48:21.087 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:95: _common_test: grep 'copy_up_to\|copy_after' td/osd-backfill-recovery-log/osd.0.log td/osd-backfill-recovery-log/osd.1.log td/osd-backfill-recovery-log/osd.2.log td/osd-backfill-recovery-log/osd.3.log td/osd-backfill-recovery-log/osd.4.log td/osd-backfill-recovery-log/osd.5.log
2026-03-08T22:48:21.095 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:96: _common_test: rm -f td/osd-backfill-recovery-log/result.log
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stdout:TEST FAILED
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:97: _common_test: '[' 2 '!=' 0 ']'
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:98: _common_test: echo TEST FAILED
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:99: _common_test: return 1
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh:34: run: return 1
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2379: main: code=1
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-backfill-recovery-log 1
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-backfill-recovery-log
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=1
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-backfill-recovery-log KILL
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:48:21.096 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:48:21.097 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:48:21.097 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:48:21.097 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:48:21.108 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:48:21.109 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:48:21.109 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:48:21.109 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:48:21.110 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:48:21.110 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:48:21.110 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:48:21.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:48:21.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:48:21.111 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:48:21.111 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:48:21.112 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:48:21.113 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 1 = 1 ']'
2026-03-08T22:48:21.113 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: '[' -n '' ']'
2026-03-08T22:48:21.113 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:194: teardown: mkdir -p /home/ubuntu/cephtest/archive/log
2026-03-08T22:48:21.114 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:195: teardown: mv td/osd-backfill-recovery-log/mgr.x.log td/osd-backfill-recovery-log/mon.a.log td/osd-backfill-recovery-log/osd.0.log td/osd-backfill-recovery-log/osd.1.log td/osd-backfill-recovery-log/osd.2.log td/osd-backfill-recovery-log/osd.3.log td/osd-backfill-recovery-log/osd.4.log td/osd-backfill-recovery-log/osd.5.log /home/ubuntu/cephtest/archive/log
2026-03-08T22:48:21.114 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-backfill-recovery-log
2026-03-08T22:48:21.130 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:48:21.130 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:21.130 INFO:tasks.workunit.client.0.vm06.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.42776
2026-03-08T22:48:21.130 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.42776
2026-03-08T22:48:21.131 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:48:21.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:48:21.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:48:21.131 INFO:tasks.workunit.client.0.vm06.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 1
2026-03-08T22:48:21.131 INFO:tasks.workunit:Stopping ['osd-backfill'] on client.0...
2026-03-08T22:48:21.132 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-08T22:48:21.535 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 105, in run_tasks
    manager = run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 83, in run_one_task
    return task(**kwargs)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/workunit.py", line 144, in task
    _spawn_on_all_clients(ctx, refspec, all_tasks, config.get('env'),
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/workunit.py", line 292, in _spawn_on_all_clients
    with parallel() as p:
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 84, in __exit__
    for result in self:
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 98, in __next__
    resurrect_traceback(result)
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 30, in resurrect_traceback
    raise exc.exc_info[1]
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 23, in capture_traceback
    return func(*args, **kwargs)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/workunit.py", line 433, in _run_tests
    remote.run(
  File "/home/teuthos/teuthology/teuthology/orchestra/remote.py", line 575, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 461, in run
    r.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed (workunit test osd-backfill/osd-backfill-recovery-log.sh) on vm06 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh'
2026-03-08T22:48:21.536 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-08T22:48:21.538 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-08T22:48:21.538 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-08T22:48:21.552 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system.
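The CommandFailedError string doubles as a reproducer: it is the exact shell command teuthology ran on vm06. Stripped of the teuthology-shipped wrappers (adjust-ulimits and ceph-coverage), a sketch for re-running the failing workunit by hand on a comparable machine would look like this (all paths and variables copied from the error above; the wrappers are omitted as an assumption that they are not needed for a manual run):

    # Reproduce the failing workunit outside teuthology (wrappers omitted).
    mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
    cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
    CEPH_CLI_TEST_DUP_COMMAND=1 \
    CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 \
    TESTDIR="/home/ubuntu/cephtest" \
    CEPH_ARGS="--cluster ceph" \
    CEPH_ID="0" \
    PATH=$PATH:/usr/sbin \
    CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 \
    CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 \
    CEPH_MNT=/home/ubuntu/cephtest/mnt.0 \
    timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh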
2026-03-08T22:48:21.552 DEBUG:teuthology.orchestra.run.vm06:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done 2026-03-08T22:48:21.621 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:21.775 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:21.775 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:21.884 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:21.884 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T22:48:21.884 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-08T22:48:21.884 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:21.893 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:21.894 INFO:teuthology.orchestra.run.vm06.stdout: ceph* 2026-03-08T22:48:22.054 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-08T22:48:22.054 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 47.1 kB disk space will be freed. 2026-03-08T22:48:22.086 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118605 files and directories currently installed.) 2026-03-08T22:48:22.088 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:23.045 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:23.077 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:23.217 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:23.217 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:23.316 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:23.316 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T22:48:23.316 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev 2026-03-08T22:48:23.316 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 
2026-03-08T22:48:23.323 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:23.324 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm* cephadm* 2026-03-08T22:48:23.468 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded. 2026-03-08T22:48:23.468 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 1775 kB disk space will be freed. 2026-03-08T22:48:23.499 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118603 files and directories currently installed.) 2026-03-08T22:48:23.501 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:23.518 INFO:teuthology.orchestra.run.vm06.stdout:Removing cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:23.544 INFO:teuthology.orchestra.run.vm06.stdout:Looking for files to backup/remove ... 2026-03-08T22:48:23.545 INFO:teuthology.orchestra.run.vm06.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*. 2026-03-08T22:48:23.547 INFO:teuthology.orchestra.run.vm06.stdout:Removing user `cephadm' ... 2026-03-08T22:48:23.547 INFO:teuthology.orchestra.run.vm06.stdout:Warning: group `nogroup' has no more members. 2026-03-08T22:48:23.558 INFO:teuthology.orchestra.run.vm06.stdout:Done. 2026-03-08T22:48:23.577 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T22:48:23.661 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118529 files and directories currently installed.) 2026-03-08T22:48:23.662 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:24.614 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:24.646 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:24.791 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:24.791 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 
2026-03-08T22:48:24.893 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:24.893 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T22:48:24.893 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev 2026-03-08T22:48:24.893 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:24.901 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:24.901 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds* 2026-03-08T22:48:25.054 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-08T22:48:25.054 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 7437 kB disk space will be freed. 2026-03-08T22:48:25.085 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118529 files and directories currently installed.) 2026-03-08T22:48:25.086 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:25.484 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T22:48:25.628 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118521 files and directories currently installed.) 2026-03-08T22:48:25.631 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:27.010 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:27.041 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:27.204 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:27.204 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 
2026-03-08T22:48:27.320 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools python3-cheroot 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-psutil python3-pyinotify 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-simplegeneric python3-simplejson python3-singledispatch 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-sklearn python3-sklearn-lib python3-tempita python3-tempora 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-threadpoolctl python3-waitress python3-webob python3-websocket 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout: sg3-utils-udev 2026-03-08T22:48:27.321 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:27.330 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:27.330 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local* 2026-03-08T22:48:27.331 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-k8sevents* 2026-03-08T22:48:27.489 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 4 to remove and 10 not upgraded. 2026-03-08T22:48:27.489 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 165 MB disk space will be freed. 2026-03-08T22:48:27.521 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 118521 files and directories currently installed.) 2026-03-08T22:48:27.522 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:48:27.535 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:27.561 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:27.619 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:28.079 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117937 files and directories currently installed.) 2026-03-08T22:48:28.080 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:29.336 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:29.366 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:29.490 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:29.490 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-08T22:48:29.589
INFO:teuthology.orchestra.run.vm06.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T22:48:29.589 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:29.596 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:29.596 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw* 2026-03-08T22:48:29.737 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded. 2026-03-08T22:48:29.737 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 472 MB disk space will be freed. 2026-03-08T22:48:29.767 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117937 files and directories currently installed.) 2026-03-08T22:48:29.768 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:29.821 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:30.176 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:30.550 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:30.957 INFO:teuthology.orchestra.run.vm06.stdout:Removing radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:31.346 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:31.380 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:31.799 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T22:48:31.828 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T22:48:31.889 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117455 files and directories currently installed.) 2026-03-08T22:48:31.891 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:48:32.469 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:32.889 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:33.311 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:33.743 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:34.941 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:34.975 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:35.082 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:35.083 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T22:48:35.198 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 
2026-03-08T22:48:35.207 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:35.207 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse* 2026-03-08T22:48:35.358 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-08T22:48:35.358 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 3673 kB disk space will be freed. 2026-03-08T22:48:35.389 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117443 files and directories currently installed.) 2026-03-08T22:48:35.391 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:35.783 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T22:48:35.865 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117434 files and directories currently installed.) 2026-03-08T22:48:35.866 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:37.085 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:37.116 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:37.260 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:37.261 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information...
2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout:Package 'ceph-test' is not installed, so not removed 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T22:48:37.355 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:37.368 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:48:37.368 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:37.398 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:37.525 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:37.526 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 
2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout:Package 'ceph-volume' is not installed, so not removed 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T22:48:37.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-08T22:48:37.627 INFO:teuthology.orchestra.run.vm06.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T22:48:37.627 INFO:teuthology.orchestra.run.vm06.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T22:48:37.627 INFO:teuthology.orchestra.run.vm06.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T22:48:37.627 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:37.640 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:48:37.640 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:37.670 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:37.803 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:37.804 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 
2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout:Package 'radosgw' is not installed, so not removed 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T22:48:37.909 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:37.923 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:48:37.923 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:37.955 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:38.106 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:38.107 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 
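The repeated "Package '...' is not installed, so not removed" lines in the last three transactions show that apt-get treats removal of an absent package as a no-op, which is why the teardown can re-issue every package name unconditionally instead of checking state first. For reference, a minimal sketch of making that check explicit with dpkg-query (the package name here is illustrative, not a command from this run):

    # Prints the current state ("installed", "config-files", ...) or fails
    # if dpkg has no record of the package at all.
    if dpkg-query -W -f='${db:Status-Status}' ceph-test 2>/dev/null | grep -qx installed; then
        echo "ceph-test is still installed"
    fi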
2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet zip 2026-03-08T22:48:38.210 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:38.218 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:38.218 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs* python3-rados* python3-rgw* 2026-03-08T22:48:38.370 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 3 to remove and 10 not upgraded. 2026-03-08T22:48:38.370 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 2062 kB disk space will be freed. 2026-03-08T22:48:38.405 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117434 files and directories currently installed.) 2026-03-08T22:48:38.407 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:38.418 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:38.428 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:39.215 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:39.249 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:39.390 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:39.390 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:39.495 INFO:teuthology.orchestra.run.vm06.stdout:Package 'python3-rgw' is not installed, so not removed 2026-03-08T22:48:39.495 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:39.495 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:39.495 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile
sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet zip 2026-03-08T22:48:39.496 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:39.512 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:48:39.512 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:39.542 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:39.713 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:39.713 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout:Package 'python3-cephfs' is not installed, so not removed 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:39.819 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet zip 2026-03-08T22:48:39.820 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to 
remove them. 2026-03-08T22:48:39.834 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:48:39.834 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:39.867 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:40.046 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:40.046 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:40.161 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:40.161 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:40.161 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T22:48:40.161 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet zip 2026-03-08T22:48:40.162 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:40.172 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:40.172 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd* 2026-03-08T22:48:40.339 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 
2026-03-08T22:48:40.340 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 1186 kB disk space will be freed. 2026-03-08T22:48:40.374 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117410 files and directories currently installed.) 2026-03-08T22:48:40.376 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:41.380 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:41.412 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:41.569 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:41.569 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 2026-03-08T22:48:41.672 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:41.672 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:41.672 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T22:48:41.672 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T22:48:41.672 INFO:teuthology.orchestra.run.vm06.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T22:48:41.672 INFO:teuthology.orchestra.run.vm06.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:41.673
INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet zip 2026-03-08T22:48:41.673 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:41.684 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:41.684 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-dev* libcephfs2* 2026-03-08T22:48:41.829 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded. 2026-03-08T22:48:41.829 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 3202 kB disk space will be freed. 2026-03-08T22:48:41.860 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117402 files and directories currently installed.) 2026-03-08T22:48:41.862 INFO:teuthology.orchestra.run.vm06.stdout:Removing libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:41.872 INFO:teuthology.orchestra.run.vm06.stdout:Removing libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:41.894 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T22:48:42.781 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:42.812 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:42.975 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:42.976 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information...
2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout:Package 'libcephfs-dev' is not installed, so not removed 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:43.077 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet zip 2026-03-08T22:48:43.078 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:43.093 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:48:43.093 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:43.124 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:43.263 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:43.264 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information... 
2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout: smartmontools socat unzip xmlstarlet zip 2026-03-08T22:48:43.371 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:43.379 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED: 2026-03-08T22:48:43.379 INFO:teuthology.orchestra.run.vm06.stdout: librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph* 2026-03-08T22:48:43.379 INFO:teuthology.orchestra.run.vm06.stdout: qemu-block-extra* rbd-fuse* 2026-03-08T22:48:43.518 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded. 2026-03-08T22:48:43.518 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 51.6 MB disk space will be freed. 
2026-03-08T22:48:43.548 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117387 files and directories currently installed.) 2026-03-08T22:48:43.549 INFO:teuthology.orchestra.run.vm06.stdout:Removing rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:43.560 INFO:teuthology.orchestra.run.vm06.stdout:Removing libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:43.571 INFO:teuthology.orchestra.run.vm06.stdout:Removing libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:43.581 INFO:teuthology.orchestra.run.vm06.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ... 2026-03-08T22:48:43.976 INFO:teuthology.orchestra.run.vm06.stdout:Removing librbd1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:43.989 INFO:teuthology.orchestra.run.vm06.stdout:Removing librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:44.001 INFO:teuthology.orchestra.run.vm06.stdout:Removing librados2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:48:44.025 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T22:48:44.059 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T22:48:44.121 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117336 files and directories currently installed.) 2026-03-08T22:48:44.123 INFO:teuthology.orchestra.run.vm06.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ... 2026-03-08T22:48:45.360 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:45.391 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:45.530 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 2026-03-08T22:48:45.530 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information...
2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout:Package 'librbd1' is not installed, so not removed 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout: smartmontools socat unzip xmlstarlet zip 2026-03-08T22:48:45.650 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:48:45.665 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:48:45.665 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:48:45.695 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists... 2026-03-08T22:48:45.865 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree... 
2026-03-08T22:48:45.865 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information...
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout:Package 'rbd-fuse' is not installed, so not removed
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-08T22:48:46.013 INFO:teuthology.orchestra.run.vm06.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:48:46.029 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded.
2026-03-08T22:48:46.029 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
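
The `--force-yes is deprecated` warning repeated above is apt's response to the legacy flag teuthology still passes; since apt 1.1 the flag maps onto a set of finer-grained options. A rough modern equivalent of these removal calls (a sketch, not the literal teuthology invocation), keeping the same non-interactive dpkg settings, might look like:

    # sketch: replacing the deprecated --force-yes with the --allow-* options apt suggests
    sudo DEBIAN_FRONTEND=noninteractive apt-get -y \
        --allow-downgrades --allow-remove-essential --allow-change-held-packages \
        -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" \
        remove librbd1 rbd-fuse
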
2026-03-08T22:48:46.030 DEBUG:teuthology.orchestra.run.vm06:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq
2026-03-08T22:48:46.084 DEBUG:teuthology.orchestra.run.vm06:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove
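
These two commands finish the purge: the dpkg pipeline greps the status columns of `dpkg -l` for packages left Unpacked (U) or Half-installed (H) with the Reinst-required flag (R) and force-purges them, and the autoremove pass, whose output follows, drops everything that was only pulled in as a dependency. A standalone sketch of the same state-flag scan:

    # sketch: list packages stuck in a broken dpkg state before purging them;
    # in dpkg -l the first three columns are desired action, status, and error flag
    dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}'
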
2026-03-08T22:48:46.157 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists...
2026-03-08T22:48:46.318 INFO:teuthology.orchestra.run.vm06.stdout:Building dependency tree...
2026-03-08T22:48:46.318 INFO:teuthology.orchestra.run.vm06.stdout:Reading state information...
2026-03-08T22:48:46.434 INFO:teuthology.orchestra.run.vm06.stdout:The following packages will be REMOVED:
2026-03-08T22:48:46.434 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-08T22:48:46.434 INFO:teuthology.orchestra.run.vm06.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-08T22:48:46.434 INFO:teuthology.orchestra.run.vm06.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:48:46.434 INFO:teuthology.orchestra.run.vm06.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-08T22:48:46.435 INFO:teuthology.orchestra.run.vm06.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-08T22:48:46.587 INFO:teuthology.orchestra.run.vm06.stdout:0 upgraded, 0 newly installed, 87 to remove and 10 not upgraded.
2026-03-08T22:48:46.587 INFO:teuthology.orchestra.run.vm06.stdout:After this operation, 107 MB disk space will be freed.
2026-03-08T22:48:46.627 INFO:teuthology.orchestra.run.vm06.stdout:(Reading database ... 117336 files and directories currently installed.)
2026-03-08T22:48:46.628 INFO:teuthology.orchestra.run.vm06.stdout:Removing ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:48:46.643 INFO:teuthology.orchestra.run.vm06.stdout:Removing jq (1.6-2.1ubuntu3.1) ...
2026-03-08T22:48:46.652 INFO:teuthology.orchestra.run.vm06.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ...
2026-03-08T22:48:46.662 INFO:teuthology.orchestra.run.vm06.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-08T22:48:46.673 INFO:teuthology.orchestra.run.vm06.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-08T22:48:46.683 INFO:teuthology.orchestra.run.vm06.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-08T22:48:46.703 INFO:teuthology.orchestra.run.vm06.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-08T22:48:46.713 INFO:teuthology.orchestra.run.vm06.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-08T22:48:46.723 INFO:teuthology.orchestra.run.vm06.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-08T22:48:46.739 INFO:teuthology.orchestra.run.vm06.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-08T22:48:46.749 INFO:teuthology.orchestra.run.vm06.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-08T22:48:46.758 INFO:teuthology.orchestra.run.vm06.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:48:46.768 INFO:teuthology.orchestra.run.vm06.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:48:46.777 INFO:teuthology.orchestra.run.vm06.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:48:46.787 INFO:teuthology.orchestra.run.vm06.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ...
2026-03-08T22:48:46.797 INFO:teuthology.orchestra.run.vm06.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ...
2026-03-08T22:48:46.807 INFO:teuthology.orchestra.run.vm06.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-08T22:48:46.817 INFO:teuthology.orchestra.run.vm06.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-08T22:48:46.827 INFO:teuthology.orchestra.run.vm06.stdout:Removing luarocks (3.8.0+dfsg1-1) ...
2026-03-08T22:48:46.849 INFO:teuthology.orchestra.run.vm06.stdout:Removing liblua5.3-dev:amd64 (5.3.6-1build1) ...
2026-03-08T22:48:46.860 INFO:teuthology.orchestra.run.vm06.stdout:Removing libnbd0 (1.10.5-1) ...
2026-03-08T22:48:46.870 INFO:teuthology.orchestra.run.vm06.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-08T22:48:46.879 INFO:teuthology.orchestra.run.vm06.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-08T22:48:46.888 INFO:teuthology.orchestra.run.vm06.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-08T22:48:46.898 INFO:teuthology.orchestra.run.vm06.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ...
2026-03-08T22:48:46.909 INFO:teuthology.orchestra.run.vm06.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-08T22:48:46.918 INFO:teuthology.orchestra.run.vm06.stdout:Removing libreadline-dev:amd64 (8.1.2-1) ...
2026-03-08T22:48:46.927 INFO:teuthology.orchestra.run.vm06.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ...
2026-03-08T22:48:46.937 INFO:teuthology.orchestra.run.vm06.stdout:update-initramfs: deferring update (trigger activated)
2026-03-08T22:48:46.945 INFO:teuthology.orchestra.run.vm06.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ...
2026-03-08T22:48:46.961 INFO:teuthology.orchestra.run.vm06.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ...
2026-03-08T22:48:46.971 INFO:teuthology.orchestra.run.vm06.stdout:Removing lua-any (27ubuntu1) ...
2026-03-08T22:48:46.980 INFO:teuthology.orchestra.run.vm06.stdout:Removing lua-sec:amd64 (1.0.2-1) ...
2026-03-08T22:48:46.990 INFO:teuthology.orchestra.run.vm06.stdout:Removing lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ...
2026-03-08T22:48:47.002 INFO:teuthology.orchestra.run.vm06.stdout:Removing lua5.1 (5.1.5-8.1build4) ...
2026-03-08T22:48:47.017 INFO:teuthology.orchestra.run.vm06.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ...
2026-03-08T22:48:47.397 INFO:teuthology.orchestra.run.vm06.stdout:Removing pkg-config (0.29.2-1ubuntu3) ...
2026-03-08T22:48:47.428 INFO:teuthology.orchestra.run.vm06.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-08T22:48:47.453 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-pecan (1.3.3-4ubuntu2) ...
2026-03-08T22:48:47.509 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-webtest (2.0.35-1) ...
2026-03-08T22:48:47.556 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-pastescript (2.0.2-4) ...
2026-03-08T22:48:47.607 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-pastedeploy (2.1.1-1) ...
2026-03-08T22:48:47.652 INFO:teuthology.orchestra.run.vm06.stdout:Removing python-pastedeploy-tpl (2.1.1-1) ...
2026-03-08T22:48:47.663 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-08T22:48:47.713 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-08T22:48:47.961 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-google-auth (1.5.1-3) ...
2026-03-08T22:48:48.009 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-cachetools (5.0.0-1) ...
2026-03-08T22:48:48.055 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:48:48.099 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:48:48.147 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-cherrypy3 (18.6.1-4) ...
2026-03-08T22:48:48.202 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-08T22:48:48.248 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-jaraco.collections (3.4.0-2) ...
2026-03-08T22:48:48.292 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-jaraco.classes (3.2.1-3) ...
2026-03-08T22:48:48.336 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-portend (3.0.0-1) ...
2026-03-08T22:48:48.380 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-tempora (4.1.2-1) ...
2026-03-08T22:48:48.426 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-jaraco.text (3.6.0-2) ...
2026-03-08T22:48:48.469 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-jaraco.functools (3.4.0-2) ...
2026-03-08T22:48:48.515 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-08T22:48:48.637 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ...
2026-03-08T22:48:48.693 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-logutils (0.3.3-8) ...
2026-03-08T22:48:48.739 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-mako (1.1.3+ds1-2ubuntu0.1) ...
2026-03-08T22:48:48.786 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-natsort (8.0.2-1) ...
2026-03-08T22:48:48.831 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-paste (3.5.0+dfsg1-1) ...
2026-03-08T22:48:48.895 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-prettytable (2.5.0-2) ...
2026-03-08T22:48:48.942 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-psutil (5.9.0-1build1) ...
2026-03-08T22:48:48.993 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-pyinotify (0.9.6-1.3) ...
2026-03-08T22:48:49.042 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-routes (2.5.1-1ubuntu1) ...
2026-03-08T22:48:49.094 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-08T22:48:49.142 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-08T22:48:49.190 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-rsa (4.8-1) ...
2026-03-08T22:48:49.251 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-simplegeneric (0.8.1-3) ...
2026-03-08T22:48:49.297 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-08T22:48:49.348 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-singledispatch (3.4.0.3-3) ...
2026-03-08T22:48:49.392 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-08T22:48:49.415 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-tempita (0.5.2-6ubuntu1) ...
2026-03-08T22:48:49.459 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-08T22:48:49.504 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-waitress (1.4.4-1.1ubuntu1.1) ...
2026-03-08T22:48:49.548 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-08T22:48:49.591 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-08T22:48:49.635 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-08T22:48:49.682 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ...
2026-03-08T22:48:49.728 INFO:teuthology.orchestra.run.vm06.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-08T22:48:49.770 INFO:teuthology.orchestra.run.vm06.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-08T22:48:49.791 INFO:teuthology.orchestra.run.vm06.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-08T22:48:50.164 INFO:teuthology.orchestra.run.vm06.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-08T22:48:50.175 INFO:teuthology.orchestra.run.vm06.stdout:Removing unzip (6.0-26ubuntu3.2) ...
2026-03-08T22:48:50.193 INFO:teuthology.orchestra.run.vm06.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-08T22:48:50.210 INFO:teuthology.orchestra.run.vm06.stdout:Removing zip (3.0-12build2) ...
2026-03-08T22:48:50.233 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-08T22:48:50.242 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-08T22:48:50.284 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ...
2026-03-08T22:48:50.291 INFO:teuthology.orchestra.run.vm06.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-08T22:48:50.310 INFO:teuthology.orchestra.run.vm06.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-1092-kvm
2026-03-08T22:48:51.751 INFO:teuthology.orchestra.run.vm06.stdout:W: mkconf: MD subsystem is not loaded, thus I cannot scan for arrays.
2026-03-08T22:48:51.751 INFO:teuthology.orchestra.run.vm06.stdout:W: mdadm: failed to auto-generate temporary mdadm.conf file.
2026-03-08T22:48:53.568 INFO:teuthology.orchestra.run.vm06.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T22:48:53.571 DEBUG:teuthology.parallel:result is None
2026-03-08T22:48:53.571 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm06.local
2026-03-08T22:48:53.571 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-08T22:48:53.619 DEBUG:teuthology.orchestra.run.vm06:> sudo apt-get update
2026-03-08T22:48:53.910 INFO:teuthology.orchestra.run.vm06.stdout:Hit:1 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-08T22:48:54.171 INFO:teuthology.orchestra.run.vm06.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-08T22:48:54.273 INFO:teuthology.orchestra.run.vm06.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-08T22:48:54.376 INFO:teuthology.orchestra.run.vm06.stdout:Hit:4 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-08T22:48:55.106 INFO:teuthology.orchestra.run.vm06.stdout:Reading package lists...
2026-03-08T22:48:55.118 DEBUG:teuthology.parallel:result is None
2026-03-08T22:48:55.119 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-08T22:48:55.126 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-08T22:48:55.126 DEBUG:teuthology.orchestra.run.vm06:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
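
The skew check above tries ntpq first and falls back to chronyc, with a trailing `|| true` so teardown never fails just because no time daemon answers. In the peer table that follows, `*` marks the peer the daemon is synced to, `+` marks usable candidates, and the `.POOL.` refid rows are unresolved pool placeholders. The same tolerant probe, as a standalone sketch:

    # sketch: query whichever time daemon is present, never failing the caller
    PATH=/usr/bin:/usr/sbin ntpq -p \
        || PATH=/usr/bin:/usr/sbin chronyc sources \
        || true
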
2026-03-08T22:48:56.421 INFO:teuthology.orchestra.run.vm06.stdout: remote refid st t when poll reach delay offset jitter
2026-03-08T22:48:56.421 INFO:teuthology.orchestra.run.vm06.stdout:==============================================================================
2026-03-08T22:48:56.421 INFO:teuthology.orchestra.run.vm06.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+home.of.the.smi .BBgp. 1 u 69 64 77 41.604 +3.565 4.166
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+telesto.hot-chi 193.67.79.202 2 u 4 64 177 24.984 -0.190 3.519
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+web35.weingaert 130.149.17.21 2 u 5 64 177 27.867 +3.418 2.692
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+47.ip-51-75-67. 185.248.188.98 2 u 1 64 177 21.158 +0.423 3.185
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+217.115.11.162 131.188.3.223 2 u 7 64 77 29.879 +1.529 3.825
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+nox.spnr.de 192.53.103.104 2 u 4 64 177 32.756 -2.204 2.409
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+srv01.spectre-n 130.149.17.21 2 u 66 64 77 24.048 +5.579 3.397
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:*time2.uni-stutt .PZF. 1 u 1 64 177 26.642 -1.496 2.360
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+web80.weingaert 130.149.17.21 2 u 6 64 177 28.286 -1.675 2.828
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+ntp5.kernfusion 237.17.204.95 2 u 5 64 177 28.788 +0.082 2.920
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+pve2.h4x-gamers 237.17.204.95 2 u 1 64 177 25.025 +0.197 1.747
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+185.252.140.126 218.73.139.35 2 u 1 64 177 25.028 +0.078 2.024
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+netcup02.therav 189.97.54.122 2 u 2 64 177 28.199 +1.105 3.149
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+mail.morbitzer. 205.46.178.169 2 u 2 64 177 28.226 -2.601 2.636
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+185.125.190.57 194.121.207.249 2 u 16 64 177 32.519 +1.362 3.665
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+ns8.starka.st 79.133.44.139 2 u 66 64 77 22.889 -1.059 2.245
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+hetzner01.ziege 131.188.3.223 2 u 66 64 77 25.058 +5.369 4.862
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+185.232.69.65 ( .PHC0. 1 u 67 64 77 28.251 -2.494 3.282
2026-03-08T22:48:56.422 INFO:teuthology.orchestra.run.vm06.stdout:+185.125.190.56 79.243.60.50 2 u 15 64 177 32.582 +1.252 3.217
2026-03-08T22:48:56.422 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-08T22:48:56.424 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-08T22:48:56.424 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-08T22:48:56.426 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-08T22:48:56.430 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-08T22:48:56.434 INFO:teuthology.task.internal:Duration was 557.862098 seconds
2026-03-08T22:48:56.434 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-08T22:48:56.436 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-08T22:48:56.436 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-08T22:48:56.455 INFO:teuthology.task.internal.syslog:Checking logs for errors...
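
The log check that follows scans the captured kern.log for kernel-level markers (BUG, INFO, DEADLOCK) and then strips a long allowlist of known-benign patterns; `head -n 1` is enough because any surviving line is treated as a failure signal. A trimmed sketch of the same scan, keeping only a couple of the real filters:

    # sketch: reduced version of teuthology's kern.log scan; the real pipeline
    # filters many more benign patterns before taking the first surviving line
    grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' kern.log \
        | grep -v 'task .* blocked for more than .* seconds' \
        | grep -v CRON \
        | head -n 1
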
2026-03-08T22:48:56.456 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm06.local
2026-03-08T22:48:56.456 DEBUG:teuthology.orchestra.run.vm06:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-08T22:48:56.511 INFO:teuthology.task.internal.syslog:Gathering journalctl...
2026-03-08T22:48:56.511 DEBUG:teuthology.orchestra.run.vm06:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:48:56.570 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-08T22:48:56.570 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-08T22:48:56.619 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:48:56.619 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:48:56.620 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:48:56.620 INFO:teuthology.orchestra.run.vm06.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-08T22:48:56.620 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-08T22:48:56.623 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 85.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
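
The gzip stderr lines above are interleaved because the find | xargs invocation runs one gzip per file with unlimited parallelism (`--max-procs=0`), so messages from concurrent processes mix in the captured stream. A minimal sketch of the same pattern, with an illustrative path:

    # sketch: compress every .log under a directory, one gzip per file, fully parallel
    find /path/to/syslog -name '*.log' -print0 \
        | xargs -0 --max-args=1 --max-procs=0 --no-run-if-empty gzip -5
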
2026-03-08T22:48:56.624 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-08T22:48:56.626 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-08T22:48:56.626 DEBUG:teuthology.orchestra.run.vm06:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-08T22:48:56.672 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-08T22:48:56.674 DEBUG:teuthology.orchestra.run.vm06:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:48:56.720 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = core
2026-03-08T22:48:56.727 DEBUG:teuthology.orchestra.run.vm06:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:48:56.770 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:48:56.770 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-08T22:48:56.773 INFO:teuthology.task.internal:Transferring archived files...
2026-03-08T22:48:56.773 DEBUG:teuthology.misc:Transferring archived files from vm06:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/281/remote/vm06
2026-03-08T22:48:56.773 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-08T22:48:57.094 INFO:teuthology.task.internal:Removing archive directory...
2026-03-08T22:48:57.094 DEBUG:teuthology.orchestra.run.vm06:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-08T22:48:57.145 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-08T22:48:57.147 INFO:teuthology.task.internal:Not uploading archives.
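
The archive transfer above streams a tarball over the existing SSH channel (`tar c -f -` writes to stdout) and the teuthology host unpacks it locally, avoiding a per-file copy. Assuming direct SSH access to the node, a manual equivalent might be (the local destination path here is illustrative):

    # sketch: pull a remote archive directory in one streamed tarball
    mkdir -p ./vm06-archive
    ssh ubuntu@vm06.local 'sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .' \
        | tar x -C ./vm06-archive
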
2026-03-08T22:48:57.147 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-08T22:48:57.149 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-08T22:48:57.149 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-08T22:48:57.191 INFO:teuthology.orchestra.run.vm06.stdout: 258075 4 drwxr-xr-x 3 ubuntu ubuntu 4096 Mar 8 22:48 /home/ubuntu/cephtest
2026-03-08T22:48:57.191 INFO:teuthology.orchestra.run.vm06.stdout: 800173 4 drwxrwxr-x 3 ubuntu ubuntu 4096 Mar 8 22:42 /home/ubuntu/cephtest/mnt.0
2026-03-08T22:48:57.191 INFO:teuthology.orchestra.run.vm06.stdout: 800174 4 drwxrwxr-x 3 ubuntu ubuntu 4096 Mar 8 22:47 /home/ubuntu/cephtest/mnt.0/client.0
2026-03-08T22:48:57.191 INFO:teuthology.orchestra.run.vm06.stdout: 1046134 4 drwxrwxr-x 3 ubuntu ubuntu 4096 Mar 8 22:47 /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-08T22:48:57.191 INFO:teuthology.orchestra.run.vm06.stdout: 1046135 4 drwxrwxr-x 2 ubuntu ubuntu 4096 Mar 8 22:48 /home/ubuntu/cephtest/mnt.0/client.0/tmp/td
2026-03-08T22:48:57.192 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:48:57.192 INFO:teuthology.orchestra.run.vm06.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-08T22:48:57.192 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm06 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-08T22:48:57.193 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-08T22:48:57.195 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: CommandFailedError: Command failed on vm06 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-08T22:48:57.196 INFO:teuthology.run:Summary data:
description: rados:standalone/{supported-random-distro$/{ubuntu_latest} workloads/osd-backfill}
duration: 557.8620975017548
failure_reason: 'Command failed (workunit test osd-backfill/osd-backfill-recovery-log.sh)
  on vm06 with status 1: ''mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp &&
  cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1
  CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest"
  CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin
  CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0
  CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage
  /home/ubuntu/cephtest/archive/coverage timeout 3h
  /home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-recovery-log.sh'''
flavor: default
owner: kyr
sentry_event: null
status: fail
success: false
2026-03-08T22:48:57.196 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:48:57.214 INFO:teuthology.run:FAIL
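
The teardown failure is a symptom, not the root cause: the workunit left /home/ubuntu/cephtest/mnt.0/client.0/tmp/td behind when osd-backfill-recovery-log.sh exited with status 1 before running its own cleanup, so the strict `rmdir` in internal.base found the test directory non-empty and raised CommandFailedError. The summary's failure_reason correctly points at the workunit itself. Before rescheduling on the same target, the leftover tree can be cleared by hand, for example:

    # sketch: manual cleanup of the leftover test directory on the target
    ssh ubuntu@vm06.local 'sudo rm -rf -- /home/ubuntu/cephtest'
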